Fitting a measurement model in a mult...
Mplus Discussion > Confirmatory Factor Analysis >
 barbara mark posted on Friday, September 22, 2000 - 9:12 am
I have a 4-indicator measurement model that performed well in the "between" analysis, but not in the "within" analysis. I either get a model that will not converge, or I get negative signs on some of the indicators. I have tried setting lambdas to 1 on different indicators, I have tried different start values for the indicators, and I have tried different combinations of correlated measurement errors among the indicators, but nothing seems to be working. Any suggestions would be appreciated.
 bmuthen posted on Saturday, September 23, 2000 - 2:34 am
Unless you have already tried this, a regular analysis of the pooled-within sample covariance matrix printed by Mplus is a good way to examine the within structure. Here, the usual procedures for finding a well-fitting model can be used. Since there are only 4 indicators, it is easy to look at the sample covariances and see whether negative loadings are to be expected. Also check that no variable has a considerably larger variance than the others.
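A minimal sketch of this two-run approach in Mplus (file names, variable names, and the NOBSERVATIONS value below are placeholders; the appropriate sample size for the pooled-within matrix is the total N minus the number of clusters, and availability of the SAVEDATA SAMPLE option depends on the Mplus version):

! Run 1: save the pooled-within sample covariance matrix
TITLE: save sample statistics;
DATA: FILE = mydata.dat;
VARIABLE: NAMES = clus y1-y4;
CLUSTER = clus;
ANALYSIS: TYPE = TWOLEVEL BASIC;
SAVEDATA: SAMPLE = within.dat;

! Run 2: regular single-level CFA on the saved matrix
TITLE: CFA on the pooled-within covariance matrix;
DATA: FILE = within.dat;
TYPE = COVARIANCE;
NOBSERVATIONS = 470;   ! total N minus number of clusters
VARIABLE: NAMES = y1-y4;
MODEL: f BY y1-y4;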
 Anonymous posted on Tuesday, May 25, 2004 - 7:12 am
I am trying a two-level CFA model with four latent variables, each measured by 3 or 4 indicators. As suggested by Muthén (1994), in the first step I am running a pooled single-level model, but several between-level indicators have more than one lambda (cross-loadings) and several loadings are negative. Following the suggestion in the output, I fixed the highest-loading indicator of each latent variable to 1, but the output shows: NO CONVERGENCE. NUMBER OF ITERATIONS EXCEEDED. What is the problem and how should I solve it?

In addition, the second step of checking the ICCs goes well, and the third step of running the pooled-within model also gives reasonable results. But the final step, the 2-level analysis, produces a FATAL ERROR:
THERE IS NOT ENOUGH MEMORY SPACE TO RUN THE PROGRAM ON THE CURRENT
INPUT FILE. THE ANALYSIS REQUIRES 4 DIMENSIONS OF INTEGRATION RESULTING
IN A TOTAL OF 50625 INTEGRATION POINTS. THIS MAY BE THE CAUSE
OF THE MEMORY SHORTAGE. YOU CAN TRY TO FREE UP SOME MEMORY BY CLOSING
OTHER APPLICATIONS THAT ARE CURRENTLY RUNNING. ANOTHER SUGGESTION IS
CLEANING UP YOUR HARD DRIVE BY DELETING UNNECESSARY FILES.

But my computer currently has about 20G of space available. Is that still not enough, or should I make some correction to my model?

Thank you very much!
 bmuthen posted on Tuesday, May 25, 2004 - 8:05 am
The failure of your first step indicates that you should modify your model, perhaps guided by exploratory factor analysis. Given the fatal error message in your final step, it sounds like you have specified some of your indicators as categorical. The 4 steps in Muthén (1994) were intended for continuous outcomes. The 4-dimensional integration in the final step is space demanding because the combination of the many integration points and a large sample size creates a huge space need. You can reduce this to, say, 10 integration points per dimension by using the option INTEGRATION = 10 (see the User's Guide). But first work on Step 1.
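The integration option goes in the ANALYSIS command. A sketch (the default is 15 integration points per dimension, and 15^4 = 50625, which matches the count in the error message above):

ANALYSIS: TYPE = TWOLEVEL;
ALGORITHM = INTEGRATION;
INTEGRATION = 10;   ! 10^4 = 10000 points instead of 15^4 = 50625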
 Anonymous posted on Wednesday, May 26, 2004 - 9:19 am
Thanks for your quick answer.
Yes, I have several categorical indicators at the between level. But could I still follow the 4 steps just by specifying that some of them are categorical in the VARIABLE command?

And I tried an EFA of the pooling model, exploring 5-6 factors, but I got the following results:

THE INPUT SAMPLE CORRELATION MATRIX IS NOT POSITIVE DEFINITE.
THE ESTIMATES GIVEN BELOW ARE STILL VALID.
PROBLEM OCCURRED IN EXPLORATORY ANALYSIS WITH 5 FACTOR(S).
THE PROGRAM HAD PROBLEMS MINIMIZING THE FUNCTION.
TOO MANY FACTORS HAVE PROBABLY BEEN USED.

But based on theory (and also because the 2-level analysis demands separate factors at both levels), I need at least five factors in the pooling models. Any suggestions?

Thanks again.
 bmuthen posted on Thursday, May 27, 2004 - 11:07 am
When you say "pooling model" I think you mean Step 1 analysis of the total covariance (correlation) matrix, referred to as S_T in my paper; the word "pooling" is a bit confusing since analysis of the pooled-within covariance matrix is Step 3.

If you hypothesize 4 within factors and 1 between factor, this does not mean that you would necessarily get 5 factors in Step 1; you may get fewer because within and between factors get a bit confounded. The EFA error message suggests that fewer than 5 factors should be attempted. And even if theory says you should get 4 within factors, that does not mean your factor analysis will confirm it. Less-than-perfect validity of the measurements may distort the intent of the theory. It is important to settle these Step 1 matters before continuing.
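In Mplus, a range of EFA solutions on the total sample matrix can be requested in a single run; a sketch with hypothetical file and variable names:

TITLE: Step 1 - EFA on the total matrix S_T;
DATA: FILE = mydata.dat;
VARIABLE: NAMES = y1-y14;
ANALYSIS: TYPE = EFA 3 5;   ! extract 3 through 5 factors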
 Anonymous posted on Friday, May 28, 2004 - 12:33 am
Yes, in Step 1 I am analyzing S_T. And I got exactly the "confounding" problem you mentioned, not only across the two levels but also with the outcome indicators (the outcome variable is also a latent variable with 3 continuous indicators), so I don't see the use of doing Step 1. I cannot get valid factors at both levels, and even the indicators of the independent variables are confounded with those of the outcomes. Does this mean my model is problematic?

I tried to just drop some of the misspecifications flagged in Step 1 and go on to Step 4 (Steps 2 and 3 work well), but Step 4 does not work. I also tried S_B, and it seems the problem comes from S_B. So should I also run an EFA at the between level?

Thank you very much for your time and help.
 bmuthen posted on Saturday, May 29, 2004 - 10:53 am
Regarding the question in your first paragraph - yes, if the loadings of your outcome indicators don't behave well your model is most likely problematic. If you haven't already, I would go back and look at the EFA here.

Regarding your question in the last paragraph, EFA on the estimated Sigma(B) is useful if you have many clusters (say > 50) - see Muthen (1994).
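Assuming a version of Mplus that supports it, the estimated Sigma(B) can be saved with the SIGBETWEEN option of the SAVEDATA command and then analyzed as summary data, with the number of clusters serving as the number of observations (all names and values below are placeholders):

! Run 1: save the estimated between-level covariance matrix
ANALYSIS: TYPE = TWOLEVEL BASIC;
SAVEDATA: SIGBETWEEN = sigb.dat;

! Run 2: EFA on the saved matrix
DATA: FILE = sigb.dat;
TYPE = COVARIANCE;
NOBSERVATIONS = 60;   ! number of clusters, ideally > 50
VARIABLE: NAMES = y1-y14;
ANALYSIS: TYPE = EFA 1 3;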
 eric duku posted on Thursday, January 12, 2006 - 11:54 am
I am running a 2-level CFA with one latent factor and 26 items. I have successfully run the first 3 steps suggested by Muthén (1994). On the final 2-level model, I get the following error message:

THE LOGLIKELIHOOD DECREASED IN THE LAST EM ITERATION. CHANGE YOUR MODEL
AND/OR STARTING VALUES.

THE MODEL ESTIMATION DID NOT TERMINATE NORMALLY DUE TO AN ERROR IN THE
COMPUTATION. CHANGE YOUR MODEL AND/OR STARTING VALUES.

I used starting values from the between and within models to no avail. My next option is to try EFA to see what I get.

Any help would be greatly appreciated! :-)

Thanks,
Eric
 Linda K. Muthen posted on Thursday, January 12, 2006 - 2:20 pm
If you have not done EFA on the pooled-within and estimated Sigma(B) matrices, I would start there.
 eric duku posted on Friday, January 13, 2006 - 9:01 am
Thanks, Linda!

I appreciate your quick response.

I'll do that and keep you posted.

Eric
 eric duku posted on Friday, January 20, 2006 - 12:29 pm
Hi Linda!

Sorry for the delay in keeping you posted...your suggestion worked...thank you! :-)

I appreciate your help.

All the best!
Eric
 Beth Livingston posted on Thursday, September 14, 2017 - 2:28 pm
Hello! I am fitting a measurement model, and then conducting an SEM, for a measure that has three latent factors (with approximately 6 indicators each) and one second-order factor, modeled at both the within and the between levels.

When I run just the measurement model, it looks good--but when I add covariates at level 1, some of the factor loadings at the between level become non-significant, the level 2 fit decreases, etc. Is this because, at the between level, we are modeling the random intercepts of each item as factor indicators, so that as level 1 multicollinearity is introduced, or more level 1 variance is explained, those random intercepts change? Or is this more likely a mistake in my syntax? In other words, is this a feature or a bug?

thanks!
 Bengt O. Muthen posted on Thursday, September 14, 2017 - 3:59 pm
Adding covariates on level 1 changes what the random intercepts represent and therefore changes level 2 results. This is well-known in the multilevel literature.
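A sketch of the point with hypothetical names: once a level-1 covariate x enters the within model, the between-level indicators are random intercepts adjusted for x rather than raw item means, so between-level loadings can shift:

%WITHIN%
fw BY y1-y6;
fw ON x;          ! level-1 covariate
%BETWEEN%
fb BY y1-y6;      ! indicators are now intercepts adjusted for x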
 Katherine Paschall posted on Friday, November 15, 2019 - 7:32 am
We’re running a one-factor second-order multilevel CFA with categorical indicators. We saw the signs of the second-order factor loadings flip: A_w, B_w, and C_w are positive at the within level but become negative at the between level (A_b, B_b, and C_b), and D_w is negative at the within level but becomes positive at the between level (D_b).

The model converged only when the first second-order indicator factor loading was freed and the second-order latent factor variance was set to 1.

%WITHIN%
A_w BY aa ab ac ad;
B_w BY ba bb bc bd;
C_w BY ca cb cc cd;
D_w BY da db dc dd;
H_w BY A_w* B_w C_w D_w ;
H_w@1;

%BETWEEN%
A_b BY aa ab ac ad;
B_b BY ba bb bc bd;
C_b BY ca cb cc cd;
D_b BY da db dc dd;
H_b BY A_b* B_b C_b D_b ;
H_b@1;

Looking at the model results, we had a pretty good fit (CFI = 0.953, TLI = 0.948, RMSEA = .020, Chi-square = 1271.159, DF = 542, P-Value = 0.000). We have 26 clusters with average size of 135.577 and ICC ranging 0.014-0.194.

I tried fixing the small negative residual variance to zero, but the model did not converge.

We’d love to have your insight into, and any suggestions for, making sense of the sign changes that occur between the within and between levels for the second-order factor loadings. Thank you!
 Bengt O. Muthen posted on Friday, November 15, 2019 - 2:08 pm
Maybe the within and between loadings are not statistically significant. You can test that by adding a run where you hold all loadings equal across levels, letting the between-level factor variance be a free parameter. Then do an LR chi-square difference test.
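With the syntax posted above, cross-level equality can be imposed through parameter labels; a sketch for the second-order loadings (the first-order loadings can be labeled the same way), with the between-level factor variance left free:

%WITHIN%
H_w BY A_w* (l1)
B_w (l2)
C_w (l3)
D_w (l4);
H_w@1;
%BETWEEN%
H_b BY A_b* (l1)
B_b (l2)
C_b (l3)
D_b (l4);
H_b;              ! between-level factor variance free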

The fact that you got convergence only when setting the metric in the factor variance suggests that using the first factor indicator as the one with loading fixed at 1 was not a good choice - perhaps due to an insignificant loading.
 Katherine Paschall posted on Tuesday, November 19, 2019 - 8:06 am
Thank you for your response. Can you give us more details on how to run an LR chi-square difference test for a multilevel CFA using WLSMV as the estimator? We tried running a normal chi-square difference test, and this is the warning message we got:

“The DIFFTEST option is not available for TYPE=TWOLEVEL.
Request for DIFFTEST will be ignored.”
 Tihomir Asparouhov posted on Tuesday, November 19, 2019 - 10:15 am
That message is correct - it is not available. You can use the MODEL TEST command for comparative testing; see page 772 of the User's Guide. You would have to switch to ML estimation for an LRT, but most likely the model you are interested in is not doable in ML. For most comparative tests the MODEL TEST command will work.
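A sketch of the MODEL TEST approach for the second-order loadings in the model posted earlier, using label lists (whether this exact labeling syntax applies may depend on the Mplus version):

%WITHIN%
H_w BY A_w* B_w C_w D_w (w1-w4);
H_w@1;
%BETWEEN%
H_b BY A_b* B_b C_b D_b (b1-b4);
H_b@1;

MODEL TEST:
0 = w1 - b1;
0 = w2 - b2;
0 = w3 - b3;
0 = w4 - b4;

This gives a joint Wald test of equal second-order loadings across levels.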