Comparison of Nested Models using pro...
 Sally Czaja posted on Friday, August 18, 2006 - 11:09 am
I am using the WLSMV procedures to compare models that I feel sure are nested (exactly the same except that one path is removed in H0), but Mplus is telling me that the H0 model is not nested in the H1 model. It reports the same degrees of freedom for both models, which I also find puzzling. Any advice?
These are the models:

H1
ANALYSIS:
TYPE=general missing h1;
MODEL:
y4 ON y1 y2 y3 x1;
y1 ON cont1 cont2 x1;
y2 ON cont3 x1;
y3 on cont1 cont3 x1;
y1 WITH y2;
y2 WITH y3;
y1 WITH y3;

H0
ANALYSIS:
TYPE=general missing h1;
MODEL:
y4 ON y1 y2 y3 x1@0;
y1 ON cont1 cont2 x1;
y2 ON cont3 x1;
y3 on cont1 cont3 x1;
y1 WITH y2;
y2 WITH y3;
y1 WITH y3;

Thank you!
 Bengt O. Muthen posted on Friday, August 18, 2006 - 4:59 pm
In using DIFFTEST, are you sure you are not putting the H0 model in the place that the H1 model should be? To check nesting, Mplus simply compares the fitting function value at the optimum (lower is better) - the model with a lower value cannot be nested within a model with a higher value. A model with one parameter fixed cannot have a lower (better) fitting function value than the corresponding model with that parameter free. The fitting function values can be seen in Tech5, left column.
 Sally Czaja posted on Monday, August 21, 2006 - 6:33 am
Thank you for your response. I feel certain that I am not switching the models but to clarify, I am saving the data file when I run the full model (which should be the better fit) and then running the DIFFTEST on the trimmed model with the fixed parameter. Is this correct?
 Bengt O. Muthen posted on Monday, August 21, 2006 - 6:41 am
Sounds right - see ex 12.12 in the User's Guide. Also check TECH1 to see the parameters used. If that doesn't help, you need to send your input, output, data, and license number to support@statmodel.com.
 Sally Czaja posted on Monday, August 21, 2006 - 7:18 am
Thank you! Ex 12.12 solved the problem. I was using the FILE IS command for saving the data file (rather than DIFFTEST IS).
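For anyone who runs into the same thing, here is a minimal sketch of the two-run setup along the lines of ex 12.12. Only the commands relevant to the difference test are shown; the derivative file name deriv.dat is arbitrary, and the ANALYSIS line stands in for whatever WLSMV setup the rest of the input uses.

! Run 1: the less restrictive H1 model; save the derivatives
ANALYSIS:  ESTIMATOR = WLSMV;
MODEL:     y4 ON y1 y2 y3 x1;       ! path of interest left free
SAVEDATA:  DIFFTEST IS deriv.dat;   ! DIFFTEST IS, not FILE IS

! Run 2: the restricted H0 model; request the difference test
ANALYSIS:  ESTIMATOR = WLSMV;
           DIFFTEST = deriv.dat;    ! reads the derivatives saved in run 1
MODEL:     y4 ON y1 y2 y3 x1@0;     ! same path, now fixed at zero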
 Gemma vilagut posted on Tuesday, May 08, 2007 - 6:22 am
I am using WLSMV to compare a model with a dichotomous covariate (begenl), including a direct effect (H1), with the same model without the covariate (H0). Mplus does not report the chi-square comparison and says that the H0 model is not nested in the H1 model.
Any help on this would be very much appreciated.
The models are:
H1:

MODEL:
ROLEF by fd4 fd7 fd8 fd9;
COGNIT by fd11a fd11b fd11c fd11d ;
MOBILT by fd13a fd13b fd13c;
SLFCARE by fd15a fd15b fd15c;
SOCIAL by fd17a fd17b fd17c fd17d fd17e;
PARTICI by fd18b fd18c fd18d fd18e fd20 fd21 fd22;

ROLEF COGNIT MOBILT SLFCARE SOCIAL PARTICI ON begenl;
fd20 ON begenl;

SAVEDATA: DIFFTEST IS modelh1.dat;

H0:
MODEL:
ROLEF by fd4 fd7 fd8 fd9;
COGNIT by fd11a fd11b fd11c fd11d ;
MOBILT by fd13a fd13b fd13c;
SLFCARE by fd15a fd15b fd15c;
SOCIAL by fd17a fd17b fd17c fd17d fd17e;
PARTICI by fd18b fd18c fd18d fd18e fd20 fd21 fd22;

ANALYSIS: DIFFTEST IS modelh1.dat;
 Linda K. Muthen posted on Tuesday, May 08, 2007 - 7:52 am
Nesting requires the same set of observed variables. You should add the following to the H0 model:

ROLEF COGNIT MOBILT SLFCARE SOCIAL PARTICI ON begenl@0;
fd20 ON begenl@0;
 Gemma vilagut posted on Tuesday, May 08, 2007 - 10:21 am
Thanks Linda.

I have tried your suggestion, but now the estimated parameters are not the same as those for the initial H0 model (without fixing the coefficients of begenl to 0).
Thanks very much! Gemma
 Linda K. Muthen posted on Tuesday, May 08, 2007 - 11:09 am
You will need to send your inputs, data, outputs, and license number to support@statmodel.com.
 ClaudiaBergomi posted on Wednesday, February 02, 2011 - 9:13 am
I am using WLSMV to test mediation and I wish to compare models with DIFFTEST. I am comparing a model with two IVs, one mediator, and one (binary) DV:

1)
MIN WITH SYM;
alc ON MIN;
alc ON SYM;
alc ON EXP;
EXP ON MIN;
EXP ON SYM;
MODEL INDIRECT:
alc IND EXP MIN;
alc IND EXP SYM;

with a nested model without the IV MIN:
2)
MIN WITH SYM@0;
alc ON MIN@0;
alc ON SYM;
alc ON EXP;
EXP ON SYM;
EXP ON MIN@0;
MODEL INDIRECT:
alc IND EXP SYM;
alc IND EXP@0 MIN@0;

In order to show that adding MIN makes the model better.

My problem is that the nested model 2) has really bad fit indices compared with the same model estimated without MIN at all (rather than including it and constraining its coefficients to 0), which is therefore not nested with 1):

3)
EXP ON SYM;
alc ON SYM;
alc ON EXP;
MODEL INDIRECT:
alc IND EXP SYM;

Nevertheless, when I describe the fit of my models I suppose I have to take the fit indices from 3), because those in 2) are 'artificially worsened'. But then, why am I allowed to calculate the DIFFTEST on the basis of 2), which of course fits worse?

Where am I going wrong?
Thank you.
 Bengt O. Muthen posted on Wednesday, February 02, 2011 - 4:58 pm
You should not have

MIN WITH SYM;

in model 1) or model 2) because they are exogenous variables and should be correlated as the default.
 ClaudiaBergomi posted on Thursday, February 03, 2011 - 5:16 am
Thank you. Now the fit in 2)=nested-one-VI has become better but it is still not as good as in 3)=non-nested-one-VI.

2)
alc ON MIN@0;
alc ON SYM;
alc ON EXP;
EXP ON SYM;
EXP ON MIN@0;
MODEL INDIRECT:
alc IND EXP SYM;
alc IND EXP@0 MIN@0;

Chi-Square Test of Model Fit
Value 78.147*
Degrees of Freedom 32
P-Value 0.0000
CFI 0.938
TLI 0.913

3)
EXP ON SYM;
alc ON SYM;
alc ON EXP;
MODEL INDIRECT:
alc IND EXP SYM;

Chi-Square Test of Model Fit
Value 19.108*
Degrees of Freedom 12
P-Value 0.0000
CFI 0.986
TLI 0.975

So I still have my previous doubt:
1. Is it right to describe the fit of the model with only one VI using the indices from model 3)=non-nested and not from 2)=nested?
2. If so, the question arises whether it is right to compute the DIFFTEST between model 1)=two-VIs and 2), as the latter has worse fit than 3) and thus the DIFFTEST is more likely to confirm my hypothesis that 1) is better.
 Bengt O. Muthen posted on Thursday, February 03, 2011 - 12:17 pm
When you say "VI", I think you mean "IV".

1. Model fit with one IV should have only one IV on the USEV list, otherwise you are also testing the zero restrictions for the other IV.

2. DIFFTEST can only be used when the same USEV variables are used in both models - so model 2) is the correct comparison model to the model with MIN having effects because this tests whether MIN has effects.
 Paul A.Tiffin posted on Wednesday, August 24, 2011 - 2:18 am
Dear Mplus team,

Am I right in assuming that a CFA with two postulated factors would not strictly be nested in a model with one factor, even if they had the same indicators?
 Bengt O. Muthen posted on Wednesday, August 24, 2011 - 12:26 pm
A one-factor model can be nested within a 2-factor model, not the other way around.
 Paul A.Tiffin posted on Thursday, August 25, 2011 - 5:22 am
Thanks Bengt,

I assume that a two factor model with perfect correlation specified between the two factors is then equivalent to a one factor model and the difference between the models' fit can then be tested (using DIFFTEST for WLSMV).

In that case what is the best way to specify perfect factor correlation?
would it be, say:
f1 ON f2@1; ?

Your help is appreciated.

Many thanks

Paul
 Bengt O. Muthen posted on Thursday, August 25, 2011 - 8:22 am
First, you will have to set the metric in the 2-factor model by fixing the factor variances at 1 (@1). Then you say f1 WITH f2@1. See how that works; note that it gives a non-positive-definite factor covariance matrix. Note also that you can't have any cross-loadings in the 2-factor model.
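For example, with hypothetical indicators y1-y6 split over the two factors and no cross-loadings, the constrained specification would look roughly like this:

MODEL:
  f1 BY y1* y2 y3;    ! all loadings free
  f2 BY y4* y5 y6;
  f1@1;  f2@1;        ! metric set by the factor variances
  f1 WITH f2@1;       ! factor correlation fixed at 1

This constrained run can then be compared with the corresponding 2-factor model in which f1 WITH f2 is left free, using DIFFTEST as above.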
 Paul A.Tiffin posted on Thursday, August 25, 2011 - 8:35 am
Thanks, that makes sense.

best wishes

Paul
 Paul A.Tiffin posted on Thursday, August 25, 2011 - 9:08 am
Dear Mplus team,

This approach did not seem to work in my case. However, there is some debate among methodologists about whether models with varying numbers of factors are truly nested. Therefore it may be better to compare models using the BIC (i.e., derived using MLR with Monte Carlo integration; the indicators are ordinal).

Is there any way of deriving a significance test for improvement in model fit using the BIC?

Many thanks

Paul
 Linda K. Muthen posted on Thursday, August 25, 2011 - 2:04 pm
Not that I know of.
 Katja Schlegel posted on Tuesday, October 04, 2011 - 3:33 am
Dear Mplus team,
I would like to compare the following models containing the same set of observed variables:
1. ERA by inq deg pla col irr peu tri joy
des fie sur amu sou int;
irr with col;
des with tri;
sou with pla;
joy with fie;

2. POS by pla joy fie amu sou int ;
NEG by inq deg col irr peu tri des;
irr with col;
des with tri;
sou with pla;
joy with fie;
sur with POS;
sur with NEG;

Are these models nested? If not, why? In this case, how can I use the BIC to compare the models if there is no significance test for this index?
Thank you very much!
 Linda K. Muthen posted on Tuesday, October 04, 2011 - 9:23 am
We do believe these models are nested. The lower BIC is the better BIC. You can also do a statistical test using -2 times the loglikelihood difference, which is distributed as chi-square with degrees of freedom equal to the difference in the number of free parameters.
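In symbols, writing LL0 and LL1 for the loglikelihoods of the restricted and unrestricted models and p0 and p1 for their numbers of free parameters, the test is

$$-2\,(LL_0 - LL_1) \;\sim\; \chi^2_{\,p_1 - p_0}.$$

This holds for plain ML; with MLR the difference needs the scaling correction described on the Mplus website.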
 Mohamed Abdel-Raouf posted on Wednesday, October 05, 2011 - 12:21 pm
Hi,
I am using WLSMV to fit a model with a binary dependent variable. Now I am trying to compare another three models with my research model. All models are based on the same indicators; I add some paths in one model, remove some in another, and use a mediator in a third. I used DIFFTEST but got a message saying that the difference cannot be computed because the models are not nested. So how can I compare these non-nested models?
Thanks,
Mohamed
 Bengt O. Muthen posted on Wednesday, October 05, 2011 - 8:50 pm
Using BIC may be a good idea.
 Mohamed Abdel-Raouf posted on Thursday, October 06, 2011 - 4:43 am
Thanks, but BIC does not appear in my output when I use WLSMV. How can I calculate it?
Many thanks,
 Linda K. Muthen posted on Thursday, October 06, 2011 - 8:27 am
BIC is for maximum likelihood not weighted least squares. I would think some of your models are nested. Perhaps you are using DIFFTEST incorrectly. You can send the relevant outputs and your license number to support@statmodel.com if you want to check this out. Otherwise, I would see which model seems to have the best overall fit taking all fit indices into account.
 Mohamed Abdel-Raouf posted on Thursday, October 06, 2011 - 10:48 am
Thanks Linda. If I am going to use your last suggestion (seeing which model has the best overall fit taking all fit indices into account), can I have a reference to support this point of view?

Thanks indeed.
 Linda K. Muthen posted on Thursday, October 06, 2011 - 1:21 pm
I don't have a reference to support this point of view. It's simply the only alternative I can think of given you don't have BIC with WLSMV. You can probably get more opinions on SEMNET.
 Mohamed Abdel-Raouf posted on Friday, October 07, 2011 - 7:45 am
Hi Linda,
I am trying to test the following models; they are based on the same indicators:

Variable: names are x1-x39 u1;
usevariable are x1-x39 u1;
Categorical is u1;

First Model:
f1 by x1-x4;
f2 by x5-x9;
f3 by x10-x15;
f4 by x16-x21;
f5 by x22-x25;
f6 by x26-x30;
f7 by x31-x36;
f8 by x37-x39;

f9 by f1-f3;
f10 by f5-f8;

f9 on f4;
u1 on f9 f10;

Second Model:
f1 by x1-x4;
f2 by x5-x9;
f3 by x10-x15;
f4 by x16-x21;
f5 by x22-x25;
f6 by x26-x30;
f7 by x31-x36;
f8 by x37-x39;

f9 by f1-f3;
f10 by f5-f8;

f10 on f4;
u1 on f9 f10;

Third model:
f1 by x1-x4;
f2 by x5-x9;
f3 by x10-x15;
f4 by x16-x21;
f5 by x22-x25;
f6 by x26-x30;
f7 by x31-x36;
f8 by x37-x39;

f9 by f1-f3;
f10 by f5-f8;

f9 f10 on f4;
u1 on f9 f10;

Do you think these are nested models? What puzzles me is that the chi-square value differs across the models while the df is the same in all three. Why?

Thanks,
Mohamed
 Linda K. Muthen posted on Friday, October 07, 2011 - 11:01 am
If the degrees of freedom are the same, the models are not nested. Having the same degrees of freedom does not mean that chi-square will be the same. You may be interested in the following article:

Bentler, P.M. and Satorra, A. (2010). Testing model nesting and equivalence. Psychological Methods, Vol. 15, No. 2, 111-123.
 Bellinda King-Kallimanis posted on Tuesday, February 28, 2012 - 3:20 am
Hello,

I believe that my models are nested, but I receive the warning message that they are not.

H1 - MultiGrp (11grp)
ANALYSIS: ESTIMATOR=WLSMV;
PARAMETERIZATION=theta;
MODEL: F BY w* (L1)
s* (L2)
wg* (L3)
e* (L4);
[F@0];
F@1;
w-e@1;
MODEL S: F BY w* (L1)
s* (L2)
wg* (L3)
e* (L4);
[F*];
F*;
w-e@1;
SAVEDATA:
difftest IS scal.dat;

H0 - Free FL and threshold for S
ANALYSIS: DIFFTEST=scal.dat;
MODEL: F BY w* (L1)
s* (L2)
wg* (L3)
e* (L4);
[F@0];
F@1;
w-e@1;
MODEL S: F BY w* !free
s* (L2)
wg* (L3)
e* (L4);
[F*];
F*;
[w$1*]; !free
w-e@1;

Any help would be really appreciated. I was wondering if I was getting a negative difference.
Cheers,
Bellinda
 Linda K. Muthen posted on Tuesday, February 28, 2012 - 10:16 am
I believe you need to remove w-e@1; from the first model.
 Yessenia Castro posted on Friday, May 11, 2012 - 1:42 pm
Hello,
I am having the same problem Gemma has detailed above (Gemma vilagut posted on Tuesday, May 08, 2007 - 10:21 am). My models differ by the removal of one path. When I simply remove the path from the input and try to run it, I get a message that the models are not nested. When I constrain the path to 0 as suggested in the response above (Linda K. Muthen posted on Tuesday, May 08, 2007 - 7:52 am), I get the diff test in my output, but my fit indices and parameter estimates are slightly different than they would be if I ran a model that just had the path removed from the input. Could you help me clarify the source of this trouble? Thanks for your time.
 Linda K. Muthen posted on Saturday, May 12, 2012 - 2:42 pm
The difference is that the set of variables used in the analysis differs when you remove the path rather than fixing it at zero.
 Yessenia Castro posted on Monday, May 14, 2012 - 9:52 am
Which parameters are the most appropriate to report; the ones that result from removing the path or the ones that result from constraining the path to zero? My interest is in the former, but I'm not sure it's appropriate to report those if the difftest is associated with the latter. Thank you.
 Linda K. Muthen posted on Monday, May 14, 2012 - 10:11 am
You should report the models used in DIFFTEST.
 Hsien-Yuan Hsu posted on Wednesday, March 13, 2013 - 11:59 pm
Dear Dr. Muthen,

I am trying to conduct a model comparison between two models with MLR estimation.

My questions:
Q1. Are these two models nested?
Q2. If yes, is Satorra-Bentler Scaled Chi-Square applicable in this case?


Model 1:
MODEL:
%Within%
fw1 BY
y1@1
y2
y3;
fw2 BY
y4@1
y5
y6;
y1-y6;
fw1;
fw2;
fw1 WITH fw2;
%Between%
y1-y6 with y1-y6;

Model 2:
MODEL:
%Within%
y1-y6 WITH y1-y6 @0;
%Between%
y1-y6 WITH y1-y6;

Thanks for your reply in advance.

Best,
Hsien-Yuan
 Linda K. Muthen posted on Thursday, March 14, 2013 - 9:03 am
Yes and yes.
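For reference, the scaled difference test with MLR is computed from the two models' loglikelihoods L0 and L1, scaling correction factors c0 and c1, and numbers of free parameters p0 and p1, all printed in the output:

$$cd = \frac{p_0\,c_0 - p_1\,c_1}{p_0 - p_1}, \qquad TRd = \frac{-2\,(L_0 - L_1)}{cd},$$

and TRd is referred to a chi-square distribution with p1 - p0 degrees of freedom.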
 ri ri  posted on Wednesday, May 06, 2015 - 11:31 am
I am comparing a full vs. partial mediation model. Here are two syntax forms. In Syntax 1 I added two direct paths to the full mediation model and fixed them at 0 when comparing to the partial model. In Syntax 2 I used the original full mediation model. Is Syntax 1 the right one, since nested models must have the same set of variables and paths? Thanks!

H1 (partial):
USEVARIABLES ARE
SL1 SL2 NSC NSA EE1 EE2 TOI1 TOI2 TOA Sick3;
CATEGORICAL = TOA Sick3;
ANALYSIS: DIFFTEST IS deriv3.dat;
ESTIMATOR = WLSMV; PARAMETERIZATION = THETA;
MODEL:
SLw BY SL1 SL2;
NSW BY NSC NSA;
BOW BY EE1 EE2;
TOIW BY TOI1 TOI2;
TOA ON TOIW;
Sick3 ON BOW;
TOIW ON NSW;
BOW ON NSW; !
NSW ON SLw;!
TOA Sick3 ON NSW;
SAVEDATA: DIFFTEST IS deriv3.dat;

H0 (full)
Syntax 1:
USEVARIABLES ARE
SL1 SL2 NSC NSA EE1 EE2 TOI1 TOI2 TOA Sick3;
CATEGORICAL = TOA Sick3;
ANALYSIS: DIFFTEST IS deriv3.dat;
ESTIMATOR = WLSMV;
PARAMETERIZATION=THETA;
MODEL:
SLw BY SL1 SL2;
NSW BY NSC NSA;
BOW BY EE1 EE2;
TOIW BY TOI1 TOI2;
TOA ON TOIW;
Sick3 ON BOW;
TOIW ON NSW;
BOW ON NSW; !
NSW ON SLw;
TOA ON NSW @0;
Sick3 ON NSW @0;

Syntax 2:same as Syntax 1 but without TOA ON NSW @0 and Sick3 ON NSW @0;
 Bengt O. Muthen posted on Wednesday, May 06, 2015 - 2:56 pm
Isn't your Syntax 2 the same as your H1(partial) model?

Syntax 1 looks correct.
 ri ri  posted on Wednesday, May 06, 2015 - 3:11 pm
Syntax 2 is a full mediation model, as follows:

USEVARIABLES ARE
SL1 SL2 NSC NSA EE1 EE2 TOI1 TOI2 TOA Sick3;
CATEGORICAL = TOA Sick3;
ANALYSIS: DIFFTEST IS deriv3.dat;
ESTIMATOR = WLSMV; PARAMETERIZATION = THETA;
MODEL:
SLw BY SL1 SL2;
NSW BY NSC NSA;
BOW BY EE1 EE2;
TOIW BY TOI1 TOI2;
TOA ON TOIW;
Sick3 ON BOW;
TOIW ON NSW;
BOW ON NSW; !
NSW ON SLw;!

I checked again what Linda posted earlier: if I compare a full vs. partial mediation model, I should ensure that the two models have the same set of paths and variables. In this case I should use Syntax 1 instead of Syntax 2, right? I tested my models with both Syntax 1 and 2, and the results are slightly different. I would like to have a final check with you. Thanks!
 Bengt O. Muthen posted on Wednesday, May 06, 2015 - 3:26 pm
You can compare models using WLSMV as long as you have the same IVs and DVs which your Syntax 2 and H1 models have, right? So syntax 1 and 2 are equally good; I don't see off-hand why they would give different results.
 ri ri  posted on Wednesday, May 06, 2015 - 3:40 pm
The result using Syntax 1:

Chi-Square Test for Difference Testing

Value 4.172
Degrees of Freedom 2
P-Value 0.1242

The result using Syntax 2:

Chi-Square Test for Difference Testing

Value 1.803
Degrees of Freedom 1
P-Value 0.1794

Although neither result is significant (p > .05), the Δχ²(df) values are different. Which one shall I report?
 ri ri  posted on Wednesday, May 06, 2015 - 4:06 pm
I analyzed again. The previous results might be wrong. This time:

With Syntax 1:

Chi-Square Test for Difference Testing

Value 1.777
Degrees of Freedom 2
P-Value 0.4112

With Syntax 2:

Chi-Square Test for Difference Testing

Value 1.440
Degrees of Freedom 2
P-Value 0.4868

The difference now is smaller. Which value/df shall I report: 1.78(2) or 1.44(2)?
 Linda K. Muthen posted on Wednesday, May 06, 2015 - 4:16 pm
Please send the two outputs and your license number to support@statmodel.com.
 ri ri  posted on Wednesday, May 06, 2015 - 4:28 pm
I think I know what caused the mistake.

In the H1, The second last line was
TOA Sick3 ON NSW;

In the H0, I wrote:

TOA ON NSW @0;
Sick3 ON NSW @0;

After I changed the H1 into:
TOA ON NSW;
Sick3 ON NSW;

the results of the two syntaxes were exactly the same. I suppose there is a difference between writing the two regressions separately versus together?
 Linda K. Muthen posted on Wednesday, May 06, 2015 - 4:42 pm
There is no difference between

TOA Sick3 ON NSW;

and

TOA ON NSW;
Sick3 ON NSW;
 ri ri  posted on Wednesday, May 06, 2015 - 5:11 pm
If in H0 I wrote

TOA ON NSW @0;
Sick3 ON NSW @0;

shall I keep them written separately in H1, i.e., TOA ON NSW; Sick3 ON NSW;

or does it not matter if the form differs between H1 and H0?
 Bengt O. Muthen posted on Thursday, May 07, 2015 - 3:31 pm
It does not matter if the way you write it differs.
 Lior Abramson posted on Sunday, September 27, 2015 - 2:11 am
Dear Mplus team,
I have a cross-lagged analysis with 3 variables measured at 2 time points. I want to test the effect of two variables (x1 and x2) on one another and the moderating effect of a third variable (x3) on them. I have a theoretical reason to believe that the interaction between x1 at time 1 and x3 at time 2 influences x2 at time 2.

My question is:
Do I need to specify all the possible interactions in the usevariables command in every model that I am comparing, even though these variables don't appear in every model?

I am asking this because when I enter only the variables with theoretical significance in the usevariables command, I get great model fit indices. When I ask for the same model but specify all the possible interactions (and there are a lot!) in the usevariables command, I get very bad model fit indices!

Thank you so much for your help in advance
 Bengt O. Muthen posted on Sunday, September 27, 2015 - 6:21 pm
I wonder if X3 is a dependent variable. Model fit for models with interactions involving DVs can be distorted. See e.g. Model 3 of Preacher et al (2007).

For general analysis advice you may want to contact SEMNET.
 Lior Abramson posted on Monday, September 28, 2015 - 2:36 am
Thank you for your reply.
I now realize that I may have had a misunderstanding regarding the meaning of nested models and chi-square difference tests, and I would like to make sure:

In order for a model to be nested within another model, or in order for two models to be comparable in a chi-square difference test, do the usevariables commands need to be identical in both models? Or is the only requirement that the nesting model contains all the paths of the nested model plus other paths?

Thanks!
 Linda K. Muthen posted on Monday, September 28, 2015 - 7:16 am
A nested model at a minimum must use the same set of dependent variables.
 Hae Yeon Lee posted on Tuesday, October 13, 2015 - 1:31 am
Dear Mplus team -

I have a question regarding DIFFTEST in an SEM multigroup analysis. I ran a multigroup analysis comparing the chi-square model fit of the unconstrained vs. constrained model. Is it possible to output a 95% CI for the chi-square difference test statistic?

As I'm using the Mac version of Mplus v7, the SAVEDATA command didn't seem to work. So I ran the two models in separate runs and manually computed the chi-square difference test (e.g., the difference in chi-square, p-values, etc.).

However, I'm not sure whether Mplus can output a 95% CI corresponding to this DIFFTEST, which I will need to report in my manuscript.

Your advice will be much appreciated!


Thanks!
 Bengt O. Muthen posted on Wednesday, October 14, 2015 - 3:16 pm
No, this is not available.
 Johan Korhonen posted on Tuesday, April 19, 2016 - 12:51 pm
Hello

I have a question about comparing two multigroup SEM models. I compare a model that imposes no equality constraints on 3 structural paths with a model that constrains these 3 paths to equality, to determine whether these 3 paths really differ between the 2 groups (all three are significant in one group and non-significant in the other). The constrained model does not decrease model fit that much: ΔCFI = .002 and ΔRMSEA = .001. However, when I test for differences in the regression slopes individually, using (b1-b2)/sqrt(SEb1^2+SEb2^2), I get a significant difference in 2 of the paths. Is it reasonable to conclude that in complex models (my model has df = 600, N = 1100, and many structural paths), small improvements might not be visible in the overall fit indices? (I'm having a hard time finding a reference for this line of thought.)
 Bengt O. Muthen posted on Friday, April 22, 2016 - 8:37 am
I don't think the test formula you show is right because if you have equalities across groups you have a violation of independence of parameter estimates. Instead, express the difference in Model Constraint or use Model Test. Or, you can use chi-square difference testing.

But you are probably right that the fit indices may not be able to pick up these differences.
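A rough sketch of the MODEL TEST route, with the group labels g1 and g2 and the path y ON x standing in for the actual model (the GROUPING option and the rest of the input are omitted):

MODEL:      y ON x;
MODEL g1:   y ON x (b1);     ! slope labeled in group 1
MODEL g2:   y ON x (b2);     ! slope labeled in group 2
MODEL TEST: 0 = b1 - b2;     ! Wald test of equal slopes across groups

MODEL TEST gives a Wald test based on the joint covariance matrix of the estimates, so it does not rely on the independence assumption behind the hand-computed z formula.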
 Ebrahim Hamedi posted on Sunday, July 03, 2016 - 6:55 pm
hello
There are two ESEM models; both have two factors. The difference is that two of the indicators are left out in Model 2 (as a way of shortening the scale).
I read the whole thread and got the impression that the two models are not nested. Do you agree? How can I compare these two models? Given that the observed variables are different in the two models, can I use BIC or AIC? If not, could you suggest any other way to compare them?

many thanks in advance
 Linda K. Muthen posted on Sunday, July 03, 2016 - 7:49 pm
Nested models must at a minimum have the same set of dependent variables. Factor indicators are dependent variables so the models would not be nested.
 Ebrahim Hamedi posted on Sunday, July 03, 2016 - 8:58 pm
Thanks for clarifying. Some books suggest that BIC and AIC can be used to compare non-nested models. However, I read some comments from the Mplus team that with different variables the metric of BIC and AIC will differ.
So, can I conclude that with different dependent variables BIC and AIC cannot be used for model comparison?

Your help is much appreciated.
 Linda K. Muthen posted on Monday, July 04, 2016 - 2:57 pm
BIC and such statistics cannot be used with models that have different sets of dependent variables.
 Rick Borst posted on Tuesday, November 15, 2016 - 7:07 am
Dear professor Muthen,

I want to compare the significance of the addition of two latent variable interactions by looking at the loglikelihood difference. Is it then correct to run the basic model:

A BY A1-A6;
O BY q54_7 q54_8 q54_9 q54_10 q54_12 q54_13 q54_14;
I BY q54_3 q54_4 q54_5;
U BY U1-U6;

U ON Agecat;
U ON Tenure;
U ON Educat;
U ON Gender;
U ON I;
U ON O;
U ON A;

Versus the model with interactions:

I@1;
O@1;
A@1;

U ON Agecat;
U ON Tenure;
U ON Educat;
U ON Gender;
U ON O;
U ON I;
U ON A;

A;
OA | O XWITH A;
IA | I XWITH A;
U ON IA;
U ON OA;

I ask because the number of free parameters is the same and the R-square decreases.
 Bengt O. Muthen posted on Tuesday, November 15, 2016 - 5:19 pm
Yes, you can use a loglikelihood ratio chi-2 test for this.

I don't know why you add

I@1;
O@1;
A@1;

when you add the interactions. That throws off the test.
 Rick Borst posted on Tuesday, November 15, 2016 - 11:49 pm
Thank you for the quick response. Is that only necessary if you want to create a loop plot? I looked at the FAQ about latent variable interactions and saw that it is done there as well.
 Bengt O. Muthen posted on Wednesday, November 16, 2016 - 5:45 pm
You shouldn't set the metric twice (loading and factor variance).
 Rick Borst posted on Wednesday, November 16, 2016 - 11:46 pm
Dear prof Muthen,

OK, so I should use an asterisk for the first loading and @1 for the factor variance, in the loop plot as well as in the loglikelihood comparison, i.e.:

A BY A1*-A6;
O BY q54_7* q54_8 q54_9 q54_10 q54_12 q54_13 q54_14;
I BY q54_3* q54_4 q54_5;
U BY U1-U6;

I@1;
O@1;
A@1;

U ON Agecat;
U ON Tenure;
U ON Educat;
U ON Gender;
U ON O;
U ON I;
U ON A;

A;
OA | O XWITH A;
IA | I XWITH A;
U ON IA;
U ON OA;
 Bengt O. Muthen posted on Thursday, November 17, 2016 - 1:42 pm
Try it and see if you get what you expect in the output.
 Sofie Henschel posted on Monday, February 20, 2017 - 7:53 am
Hello,
I am running several nested Bayes models. Would it be appropriate to calculate the loglikelihood from the BIC in order to compare the models with each other using a chi-square test?
Thanks in advance
Sofie
 Bengt O. Muthen posted on Monday, February 20, 2017 - 6:14 pm
It won't be distributed as a chi2, but perhaps it is a useful descriptive index.
 Sofie Henschel posted on Monday, February 20, 2017 - 11:23 pm
Thanks for your quick reply, Bengt. I have some follow-up questions.
1) What exactly would not be chi-square distributed: the loglikelihood, the BIC, or the difference in loglikelihoods (I am interested in the latter in order to calculate a test distribution)?
2) What distribution would the loglikelihood difference follow if not chi-square?
3) Is it possible to calculate a corresponding test distribution by bootstrapping?
4) Otherwise, which index would you suggest for deciding between nested Bayes models?
Thanks again
Sofie
 Bengt O. Muthen posted on Wednesday, February 22, 2017 - 1:12 pm
1) The "Bayesian BIC" that is printed is based on the Bayes estimates, not ML estimates so the logL is not an ML-maximized logL but a logL computed with Bayes estimates. Due to this, taking the approach of a likelihood ratio chi-square difference test isn't right when based on this Bayesian BIC; it doesn't give a chi-square.

2) This is unknown.

3) Perhaps; that is a research question.

4) I would look at the significance of the extra parameters in the less restrictive model.
 John D Peipert posted on Tuesday, February 06, 2018 - 12:44 pm
Hello Professors,

I wonder if the DIFFTEST option is a good way to confirm that the H0 model is indeed nested in the H1 model. That is, will DIFFTEST run iff the H0 model is nested within the H1 model?

I am comparing two models that I believe are nested, and DIFFTEST is running with no error, but I'm wondering if there is any case where DIFFTEST would run when the H0 model was not actually nested within the H1 model.

As always, thanks in advance for your help.
 Bengt O. Muthen posted on Tuesday, February 06, 2018 - 3:12 pm
Difftest can't make sure that the models are nested. It checks only 2 necessary things:

H0 should have fewer parameters than H1

H0 should have a higher final F value than H1, where F is printed by TECH5 and refers to the fitting function that is optimized and where low values mean better fit to the data.
 John D Peipert posted on Tuesday, February 06, 2018 - 5:14 pm
Thanks so much for this quick response. Does Mplus have any way to check on whether models are nested?
 John D Peipert posted on Wednesday, February 07, 2018 - 10:59 am
Hi again Dr. Muthen,

I have a more pointed question than the one I've asked above. I'm experiencing a problem because I have two models that I believe should be nested, but their results suggested they are not. This is a multigroup CFA model with categorical indicators using WLSMV (theta param). Model H1 has invariant residual variances and loadings, and Model H0 just has invariant residual variances.

In theory these models should be nested, but the model chi-square for H0 is higher, as is the function minimum.

In H1, the loadings are constrained this way:
Group 1 Model:
f1 BY y1* y2-y4(L1-L4);
Group 2 Model:
f1 BY y1* y2-y4 (L1-L4);

In H0, the loadings are constrained this way:
Group 1 Model:
f1 BY y1* y2-y4;
Group 2 Model:
f1 BY y1@1 y2-y4;

Do you know why these models do not appear to be nested in the results?
 Bengt O. Muthen posted on Wednesday, February 07, 2018 - 4:28 pm
No check for nestedness in Mplus (yet).

Your last message has me puzzled on 2 accounts. You say "but the model chi-square for H0 is higher" - that's how it should be for H0 because it is a stricter model. Also, your H0 model input does not have its loadings constrained. It is probably better if you send the relevant outputs to Support along with your license number.
 Louise Black posted on Monday, June 18, 2018 - 2:12 am
Dear Drs Muthen,

We are running competing CFA models with WLSMV, including correlated factors, bifactor, and S-1 (a modified bifactor in which one less specific factor is specified so that g is defined by the items of this missing group factor). We understand how to nest the correlated factors model in the bifactor solution but are having difficulty nesting the correlated model in the S-1. S-1 has more parameters than the correlated model, but the fit function seems to be worse, so the models cannot be nested with WLSMV. We have tried constraining the H1 model in 2 different ways but had no luck. We also have some residual correlations across all models, and these seem to be causing a singular matrix.
1. Can the correlated model be nested in S-1?
2. How should we interpret the singular matrix warning when the residual correlations are included?
3. Why would the fit function be worse for a model with more parameters?

Thanks,

Louise and Margarita
 Bengt O. Muthen posted on Monday, June 18, 2018 - 9:50 am
Have a look at our new NESTED checking feature described in the paper under SEM:

Asparouhov, T. & Muthén, B. (2018). Nesting and equivalence testing in Mplus. Technical Report. May 16, 2018. (Download scripts).

See especially Section 4.2 on the bi-factor model.
 Louise Black posted on Tuesday, June 26, 2018 - 2:39 am
Thank you very much for this; it was very helpful. Having read the paper and implemented the procedure, we wondered if you could clarify something:

Is the rule about the product of the two larger correlations a theoretical principle that should be followed even if the NET procedure suggests models are nested?

Our correlated model seems to violate this criterion (depending on whether correlated errors are included), so we were expecting that NET would find it not to be nested in any of our bifactor solutions (classical and S-1). This is the case even when we make sure all correlations are positive. Can we confirm that we should not perform the DIFFTEST for any of these models even though NET suggests the correlated model is nested in the classical bifactor?

many thanks,

Louise and Margarita
 Tihomir Asparouhov posted on Tuesday, June 26, 2018 - 8:36 am
The rule about the product of the two larger correlations is not a general rule - it only applies when you are looking at a 3x3 matrix equivalence to a one factor analysis model. The rule doesn't apply to other situations. If the NET procedure concludes that the models are nested I would trust that.