

I would really appreciate help with a paper in which alternative measurement models were compared using the modified chi-square difference test for the WLSMV estimator. The reviewer argued that the chi-square difference test is biased by the large sample size (N=600) and asked how large a chi-square difference is meaningful. How may this question be answered? And are there alternatives to the chi-square difference test for the WLSMV estimator? Just for information, all chi-square difference tests had p < .001. 


This is a general chi-square test issue, not specific to WLSMV. The chi-square test is sensitive to small deviations from the model when the sample size is not small. You may want to discuss this on SEMNET. My take is to do an approximate check of how sensitive the test is for your data/model situation. You can use MODINDICES to see which parameters need to be freed to get a reasonable chi-square fit, and then see how much your key parameters have changed. If they have changed only in substantively ignorable ways, you could surmise that your original model was fine for practical purposes and that the chi-square was oversensitive. Not everyone would agree with this approach, however. 
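To see the sample-size sensitivity concretely, here is a minimal sketch with hypothetical numbers: for a fixed amount of misfit per observation (a hypothetical minimized fit-function value F_min), the test statistic grows with N, so the p-value shrinks even though the misfit itself is unchanged. With df = 2 the chi-square survival function has the closed form exp(-T/2), which keeps the example self-contained.

```python
import math

# Sketch with hypothetical numbers: F_min is a made-up minimized
# fit-function value (misfit per observation); df = 2 is assumed so the
# chi-square p-value has the closed form exp(-T/2).
F_min = 0.02
for N in (200, 600, 2000):
    T = (N - 1) * F_min    # chi-square test statistic scales with N
    p = math.exp(-T / 2)   # survival function of chi-square with df = 2
    print(f"N={N:5d}  T={T:6.2f}  p={p:.6f}")
```

With these hypothetical numbers, the same per-observation misfit is nonsignificant at N = 200 but overwhelmingly significant at N = 2000.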


Dear Linda, Thank you very much for your response. I dealt with the issue by randomly selecting one-third of the sample, so that N became ~200, and reran the analyses; the results were the same, and all chi-square difference ps were .0000 in the output. I submitted the manuscript and the reviewer came back with the following comment: "I am afraid that the authors are not understanding what I am asking. Forget about the statistical tests. Statistical significance is not the issue. I am asking how much better is 3 factors versus 2 factors, and 2 factors versus 1 factor, in terms of variance accounted for or sensitivity or specificity?" What might sensitivity or specificity be when evaluating the relative fit of competing models? As for the point on variance explained, do I add up the column labeled "R square" in the output and divide the sum by the total number of indicators (assuming each indicator's variance is 1)? Many thanks in advance for your reply. 


This is a good general analysis question for SEMNET. 


I have sent the message to the SEMNET listserv and am waiting for replies. For now, may I get a quick opinion about the sum of R-squares? In the EFA situation, my understanding is that the R-squares of items cannot be summed to obtain total variance explained if the factors are correlated. Of course, in my model the factors were free to correlate and are in fact strongly correlated. Does that mean I cannot add up the R-square values to obtain total variance explained? 


The reviewer's opinions sound a bit out of date/off target. Variance accounted for is a concept suited to principal component analysis (PCA), where it is the primary goal and the uncorrelated components make it easy to add up component contributions. For EFA it is not the primary goal; explaining the correlations is. Nevertheless, with orthogonal factors you can mention how much explained variance each factor contributes; it is a descriptive feature of the factor model even though it is not the goal. Maybe that is your way to appease the reviewer. Sensitivity and specificity are defined with respect to a classification, and I don't know that you have that situation. 
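As a small illustration of the orthogonal case (all loadings hypothetical): each factor's explained variance is the sum of its squared loadings, and the contributions can be added up precisely because the factors are uncorrelated.

```python
# Sketch with hypothetical loadings for 6 items on 2 orthogonal factors.
loadings = [
    [0.7, 0.1],
    [0.6, 0.2],
    [0.8, 0.0],
    [0.1, 0.7],
    [0.2, 0.6],
    [0.0, 0.8],
]
n_items = len(loadings)

# Each orthogonal factor's contribution = sum of its squared loadings.
per_factor = [sum(row[j] ** 2 for row in loadings) for j in range(2)]
total = sum(per_factor)

print(per_factor)             # contribution of each factor
print(total / n_items)        # proportion of total variance explained
```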


Dear Dr. Muthen, First of all, Merry X'mas to you. Cameron McIntosh responded to my SEMNET message and gave me some formulas for calculating AIC and BIC for WLSMV. To do so, I would need the "sum of squared residuals" and the "minimum value of the fitting function" (the discrepancy between the observed and model-implied moments). Where may I find these values in the Mplus output? Many thanks for your help. 


The minimum function value is printed in the latest versions of Mplus. The residuals are obtained with the OUTPUT option RESIDUAL. 


I have Version 8, but I can't find the words "minimum" or "minimum function value" in the output. Can you please tell me where to locate this value? For the sum of squared residuals, do I take the residual variance of each item, square it, and then add them up? Is that it? 


Version 8.2 prints:

Optimum Function Value for Weighted Least-Squares Estimator
Value 0.24587142D-02

But you can use the Version 8 you have if you request TECH5 and then look at the left column's function value for the last iteration. It is each residual itself that has to be squared, and the squares summed up. Note, however, that as Cam said on SEMNET, this fit measure is not well investigated (or universally accepted; at least not yet). 
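For what it's worth, here is one hedged reading of "sum of squared residuals": the sum of squared elements of the residual (observed minus model-implied) matrix, counting each unique off-diagonal element once. The residual values below are hypothetical, and whether the diagonal is included depends on the exact formula being used.

```python
# Hypothetical unique off-diagonal residuals taken from a RESIDUAL-style
# output (observed minus model-implied correlations); illustrative only.
residuals = [0.03, -0.02, 0.01]

# One reading of "sum of squared residuals": square each residual and sum.
ssr = sum(r ** 2 for r in residuals)
print(ssr)
```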


Many thanks indeed for the guidance. I intend to use the AIC only as a last resort, and even then I do not intend to report it in the paper itself, for the reason you mentioned; it would be calculated only as another reference for the reviewer. It is comforting that so far none of the SEMNET commentaries has come up with an established method to address the reviewer's comment. Under the circumstances, we have to go with the most plausible conclusion, which is clear to me. Merry X'mas to you again! 


Dear Dr. Muthen, sorry to bother you again. I found that the sum of the R-squares across items plus the sum of the residual variances equals the total number of items. Does that mean the sum of R-squares is the total variance explained? But I thought the R-squares could not be added up in models with correlated factors. Another way of putting my question: is the sum of the residual variances really the total residual variance? 


Q1 and Q2: Yes. This overall total variance explained is the same for uncorrelated and correlated factors; they are just two different rotations of the same "explained" Lambda*Psi*Lambda' part of the estimated covariance matrix. What you cannot do is apportion the variance due to each correlated factor. 
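This invariance is easy to check numerically: under an oblique transformation M of an orthogonal solution (Lambda* = Lambda M, Psi* = M^-1 Psi M^-1'), the explained part Lambda Psi Lambda' is unchanged, so its diagonal (and hence the total explained variance) does not depend on whether the factors are correlated. A minimal sketch with hypothetical matrices:

```python
import numpy as np

# Hypothetical orthogonal solution: 6 items, 2 uncorrelated factors.
rng = np.random.default_rng(0)
Lambda = rng.normal(size=(6, 2))
Psi = np.eye(2)

# Hypothetical oblique transformation: Lambda* = Lambda M,
# Psi* = M^-1 Psi (M^-1)', which makes the factors correlated.
M = np.array([[1.0, 0.5],
              [0.0, 1.0]])
M_inv = np.linalg.inv(M)
Lambda2 = Lambda @ M
Psi2 = M_inv @ Psi @ M_inv.T

# The "explained" part of the covariance matrix is identical under both.
explained1 = Lambda @ Psi @ Lambda.T
explained2 = Lambda2 @ Psi2 @ Lambda2.T
print(np.allclose(explained1, explained2))
```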
