Jon Elhai posted on Wednesday, January 10, 2007 - 1:19 am
Drs. Muthen: I am testing a full model of 3 factors (17 observed, dichotomous variables, using WLSMV). My model looks like this: Y1 by x1 x2 x3 x4 x5; Y2 by x6 x7 x8 x9 x10 x11 x12; Y3 by x13 x14 x15 x16 x17;
I would like to test this full model against a more restrictive model using DIFFTEST, fixing the factor loadings for x8, x9, x13, x14, and x15 to zero, to assess whether dropping these five observed variables from the CFA estimation results in a significantly worse fit.
In the second step of DIFFTEST, I therefore specified this model, fixing those five parameters to zero: Y1 by x1 x2 x3 x4 x5; Y2 by x6 x7 x8@0 x9@0 x10 x11 x12; Y3 by x13@0 x14@0 x15@0 x16 x17;
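For reference, the full two-step DIFFTEST procedure is usually set up as two separate input files, shown together below. This is a minimal sketch, not the poster's actual inputs: the data file name and the VARIABLE specifications are placeholder assumptions.

```
! ---- Step 1 input: the less restrictive (full) model ----
! SAVEDATA saves the derivatives needed for the difference test.
TITLE:    Step 1 - full model;
DATA:     FILE IS mydata.dat;        ! placeholder file name
VARIABLE: NAMES ARE x1-x17;
          CATEGORICAL ARE x1-x17;
ANALYSIS: ESTIMATOR = WLSMV;
MODEL:    Y1 by x1 x2 x3 x4 x5;
          Y2 by x6 x7 x8 x9 x10 x11 x12;
          Y3 by x13 x14 x15 x16 x17;
SAVEDATA: DIFFTEST IS deriv.dat;

! ---- Step 2 input: the more restrictive model ----
! DIFFTEST in ANALYSIS points back to the saved derivatives.
TITLE:    Step 2 - restricted model;
DATA:     FILE IS mydata.dat;
VARIABLE: NAMES ARE x1-x17;
          CATEGORICAL ARE x1-x17;
ANALYSIS: ESTIMATOR = WLSMV;
          DIFFTEST IS deriv.dat;
MODEL:    Y1 by x1 x2 x3 x4 x5;
          Y2 by x6 x7 x8@0 x9@0 x10 x11 x12;
          Y3 by x13@0 x14@0 x15@0 x16 x17;
```

The @0 inside a BY statement fixes that variable's factor loading at zero; the rest of the model must be identical across the two runs for the difference test to be meaningful.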
However, in the second step’s output, I received the following message. So I’m wondering whether I am misunderstanding what it means to fix a parameter to zero, or whether I am doing so incorrectly in Mplus:
“THE MODEL ESTIMATION TERMINATED NORMALLY.
THE CHI-SQUARE COMPUTATION COULD NOT BE COMPLETED BECAUSE OF A SINGULAR MATRIX.
THE STANDARD ERRORS OF THE MODEL PARAMETER ESTIMATES COULD NOT BE COMPUTED. THE MODEL MAY NOT BE IDENTIFIED. CHECK YOUR MODEL. PROBLEM INVOLVING PARAMETER 16. THE CONDITION NUMBER IS -0.430D-17.”
ANALYSIS: DIFFTEST IS deriv.dat; MODEL: A by a1 a2 a3; B by b1 b2 b3 b4; P by p1 p2 p3 p4 p5; N by n1 n2 n3 n4 n5; PR by pr1 pr2 pr3 pr4 pr5 pr6 pr7 pr8 pr9 pr10; n2@0;
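One detail worth flagging in the MODEL command above: in Mplus, a standalone statement like n2@0 refers to the variance/residual variance of n2, not to its factor loading. Since the stated goal is to fix the loading of n2 on N at zero, the @0 belongs inside the BY statement. A sketch of the restricted model under that intent:

```
MODEL: A  by a1 a2 a3;
       B  by b1 b2 b3 b4;
       P  by p1 p2 p3 p4 p5;
       N  by n1 n2@0 n3 n4 n5;   ! loading of n2 on N fixed at zero
       PR by pr1 pr2 pr3 pr4 pr5 pr6 pr7 pr8 pr9 pr10;
```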
I tested the full model against a more restrictive model using DIFFTEST, fixing the factor loading of n2 to zero, to examine whether discarding the observed variable (i.e., n2) from the CFA estimation results in a significantly better fit.
My result was the following:

Chi-Square Test for Difference Testing
  Value                121.243
  Degrees of Freedom   1
  P-Value              0.0000
My question is whether my DIFFTEST process for WLSMV is correct.
I got it. Many thanks for your help!! This is the new result:

Chi-Square Test for Difference Testing
  Value                23.752
  Degrees of Freedom   1
  P-Value              0.0000
I think this is the right way in Mplus, with WLSMV, to examine whether discarding an observed variable from the CFA estimation results in a significantly better fit. If there is anything still incorrect in the process, please let me know.
It is a test of whether or not the n2 item is uncorrelated with not only the N factor but also every other factor and all other observed variables. The p-value of 0 says that this more restricted model is rejected.
Thank you very much for your interpretation. I am so glad that I asked the question; otherwise, I wouldn’t have known that the process does not match my purpose.
My purpose is to examine whether discarding an observed variable (i.e., n2) from the CFA model, using WLSMV estimation in Mplus, results in a significantly better fit.
The following are the results for Model 01 and Model 02:

Chi-Square Test of Model Fit, Model 01 (the model with n2)
  Value                488.603
  Degrees of Freedom   314

Chi-Square Test of Model Fit, Model 02 (the model without n2)
  Value                419.705
  Degrees of Freedom   289
If the above models used ML estimation, the results (chi-square difference = 68.898; df difference = 25) would show that discarding the observed variable (n2) from Model 01 results in a significantly better fit.
However, the two models use WLSMV estimation, not ML estimation.
Would you please tell me the right process, with WLSMV estimation in Mplus, for my purpose?
You cannot do chi-square difference testing when the two models have different sets of observed dependent variables (as with and without n2).
The more important point here, however, is that you want to take a quite different approach to answer your question, "do we get a better CFA model fit without the n2 item?" I think the best way to answer that is to do an exploratory factor analysis. Your 5 factors should then show up clearly, and you can see whether the n2 item has a lot of significant cross-loadings on other factors and a lot of significant residual correlations. If it does, it is an item that will contribute significantly to misfit of the CFA model you postulate.
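One way to carry out the suggested EFA in Mplus is sketched below. This is an assumed setup, not part of the original exchange: the data file name is a placeholder, and TYPE = EFA 1 5 requests 1-factor through 5-factor solutions so the 5-factor solution can be inspected for n2's cross-loadings.

```
TITLE:    EFA to examine the n2 item;
DATA:     FILE IS mydata.dat;        ! placeholder file name
VARIABLE: NAMES ARE a1-a3 b1-b4 p1-p5 n1-n5 pr1-pr10;
          CATEGORICAL ARE a1-a3 b1-b4 p1-p5 n1-n5 pr1-pr10;
ANALYSIS: TYPE = EFA 1 5;            ! 1- through 5-factor solutions
          ESTIMATOR = WLSMV;
```

In the 5-factor solution, look at whether n2 loads cleanly on its intended factor or shows sizable loadings elsewhere.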
Thank you very much for your teaching!! I am surprised that the approach I learned from other researchers using Mplus was wrong. I will pass along your valuable comment to those who took the same wrong approach as I did.