Message/Author 

Anonymous posted on Wednesday, April 23, 2003  1:24 pm



I am planning to analyze a structural equation model that contains both latent variables and observed variables in the structural part. My question is how to estimate the measurement model. Do I include the variables that will be used as observed variables (not as indicators of a latent construct) by putting them on the USEVARIABLES list, or leave them out when estimating the measurement model? Thanks.


All observed variables that are part of the analysis, whether they are used as factor indicators or not, should be included on the USEVARIABLES list. I think this is what you meant. If you are developing the measurement model as a first step, then only the observed variables that are part of the measurement model should be on the USEVARIABLES list for this step.
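As an illustration (a hypothetical sketch; the variable names are made up), a measurement-model-only first step would list just the factor indicators on USEVARIABLES, even though the data set also contains the later structural covariates:

```
TITLE: Measurement model only (step 1);
DATA: FILE IS mydata.dat;
VARIABLE: NAMES ARE y1-y6 x1 x2;    ! x1, x2 will be covariates in step 2
          USEVARIABLES ARE y1-y6;   ! only the factor indicators here
MODEL: f1 BY y1-y3;
       f2 BY y4-y6;
OUTPUT: STANDARDIZED MODINDICES;
```

In the second step, x1 and x2 would be added to USEVARIABLES along with the structural statements.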


I have a question regarding the measurement part of a structural equation model. How do I introduce exogenous variables (covariates) into the measurement equations in Mplus?

bmuthen posted on Friday, November 26, 2004  6:24 am



I assume you mean that a certain y item is directly influenced by an x covariate, which indicates lack of measurement invariance? If so, you simply say y on x; 
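For example (a hypothetical sketch with made-up names y1-y5 and x), the direct effect is just an added ON statement alongside the measurement model:

```
MODEL: f BY y1-y5;   ! measurement model
       f ON x;       ! covariate effect on the factor (MIMIC part)
       y3 ON x;      ! direct effect: y3 is not invariant with respect to x
```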


Hi, I am trying to establish measurement invariance in my data to proceed with a latent class SEM. When I ran a 2-class CFA, I fixed one of the factor loadings to unity in the overall model and then allowed the rest of the loadings to vary across classes. However, I noticed that Mplus only estimated these freed loadings for the first latent class and not for the second. The results are to my satisfaction (i.e., the no-classes model performs better than the 2-class solution), but I wanted to make sure whether I have to estimate the loadings for the second class as well and then look at the results. Here's my input:

TITLE: CFA for measurement invariance analysis
DATA: FILE IS c:\raw59.dat;
VARIABLE: NAMES ARE y1-y54 vfao ie2o pcomo muno culdo y60-y77 iedep mun dyn mult;
USEVARIABLES ARE y1-y2 y4-y6 y7-y12 y60-y77;
CLASSES = c(2);
ANALYSIS: TYPE = MIXTURE; STARTS = 50 5;
MODEL:
%OVERALL%
d BY y1@1; s BY y5@1; sat BY y8@1; longpot BY y11@1;
mun BY y61@1; dyn BY y65@1; mult BY y68@1; iedep BY y74@1;
%c#1%
d BY y2; s BY y4 y6; sat BY y7 y9; longpot BY y10 y12;
mun BY y60 y62-y63; dyn BY y64 y66 y67; mult BY y69-y72; iedep BY y73 y75-y77;
OUTPUT: TECH1 TECH8;

bmuthen posted on Monday, February 28, 2005  8:36 pm



The overall model should include the loadings for all the indicators. This is the model that holds for all classes. Then, the class-specific loadings for class 1 can be given as you did.
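For a hypothetical one-factor, two-class sketch of that setup:

```
MODEL:
%OVERALL%
f BY y1@1       ! loading of y1 fixed at 1 in all classes
     y2-y4;     ! all remaining loadings listed in the overall model
%c#1%
f BY y2-y4;     ! repeating them here frees them in class 1,
                ! so class 1 and class 2 loadings can differ
```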


Hello Dr. Muthen, thanks, that helps. My objective is to compare regression coefficients across latent classes. For this, based on Steenkamp and Baumgartner (1998), I am testing for invariance of factor loadings across the classes. Should I test for invariance of error variances as well in the measurement model? Any thoughts on this? Regards

BMuthen posted on Tuesday, March 01, 2005  10:59 am



I don't regard invariance of error variances as a critical feature. Some disciplines, however, are interested in reliability matters and would therefore test invariance of error variances. 


Hello Dr. Muthen, thanks for the guidance. Regards


I am trying to establish structural invariance across groups in a multigroup CFA. I have already tested the equality of the factor loadings on the different factors. How do I rerun it to constrain the structural paths to be equal? When I try to enter it in my second MODEL command, it comes back with the same results as if nothing were free to estimate. Is there a different command for testing the relationships between the latent variables across groups?

TITLE: Multigroup CFA
DATA: FILE IS 'P:\MultigroupCFA10112005.dat';
VARIABLE: NAMES ARE threat1 threat2 threat3 threat4 threat5 threat6
  slap1 slap2 slap3 slap4 slap5 slap6 beat1 beat2 beat3 beat4 beat5 beat6
  knife1 knife2 shoot1 shoot2 sample site sex race age;
USEVARIABLES ARE threat1 - shoot2 sex;
MISSING ARE ALL (9);
GROUPING IS sex (0=female 1=male);
MODEL: f1 BY threat6 slap6 beat6;
f2 BY threat5 slap5 beat5;
f3 BY threat2 threat3 slap2 slap3 beat2 beat3;
f4 BY threat1 threat4 slap1 slap4 beat1 beat4;
f5 BY knife1 knife2 shoot1 shoot2;
f1 WITH f2 f3 f4 f5;
MODEL male: f1 WITH f2;
OUTPUT: STAND MOD;


Equalities are specified by placing the same number in parentheses following the parameters that are to be held equal. For example: MODEL: f1 WITH f2 (1); would hold the covariance of f1 and f2 equal across all groups. See Chapter 16 of the Mplus User's Guide for a description of the special language for equalities, and Chapter 13 of the User's Guide for a description of special issues for multiple group models.

Anonymous posted on Tuesday, December 20, 2005  10:54 am



If there are numerous variables to be held equal across the groups, would the coding look like this: MODEL: f1 WITH f2 f3 f4 (1); Does this make it so that the covariances with f2, f3, and f4 are equal merely ACROSS groups, or are they also set to be equal WITHIN groups, i.e., f2 = f3 = f4? Thanks for your help - this discussion board is saving my sanity!


I believe that your statement, f1 WITH f2 f3 f4 (1); in the MODEL command of a multiple group analysis will result in the covariance between f1 and f4 being held equal across groups. f1 WITH f2-f4 (1); would result in the covariances between f1 and f2, f1 and f3, and f1 and f4 being held equal to each other and equal across groups. f1 WITH f2 (1) f3 (2) f4 (3); would result in the covariance between f1 and f2 being held equal across groups, the covariance between f1 and f3 being held equal across groups, etc. Equalities in multiple groups are described in Chapter 13.
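Put concretely (hypothetical factors f1-f4), each equality label must end its own line in an Mplus input, so holding each covariance equal across groups without equating the covariances to one another would look like:

```
MODEL: f1 WITH f2 (1)
               f3 (2)
               f4 (3);   ! three labels, three separate across-group equalities
```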

Jo Brown posted on Monday, January 28, 2013  5:33 am



Hi, I would like to test the measurement model for 3 latent variables which I use to assess mediation. To do so I ran a combined measurement model: y BY y1 y2 etc.; x BY x1 x2 etc.; m BY m1 m2 etc.; I found a good fit and no modification index suggestions. I then ran the SEM model: y BY y1 y2 etc.; x BY x1 x2 etc.; m BY m1 m2 etc.; y ON x m; m ON x; This model produced the exact same goodness-of-fit indices; I am not sure why?


The fit does not change because the structural model is just-identified.

Jo Brown posted on Tuesday, January 29, 2013  12:57 am



Thanks - so there is no need to report both measurement and structural fit, right?


There is no fit to report for the structural part of the model because it is just-identified.

Jo Brown posted on Tuesday, January 29, 2013  10:28 am



Thanks again! 


I have recently created a 45-item instrument. EFA results and theory lead me to believe that it is best represented by a 3-factor model. A sample of 750 was used for the EFA, and a different sample is being used for the CFA. Could you give me your advice and/or suggest an article that will help me become more familiar with to what extent, and how, to use modification indices to improve model fit while keeping theory in consideration? I have your Mplus book and Kline's SEM book, but am looking for something a little simpler and more specific to this question, as I intend to use the latent constructs within structural modeling in the near future.


I would listen to the Topic 1 course video. We go through using EFA and CFA in detail. 


Thanks so much! 

Jenny L. posted on Tuesday, May 07, 2013  2:54 pm



Dear Professors, I have a scale (with only 5 items, which are expected to load on the same factor) used at 2 time points. I tested factorial invariance by using the Example 5.26 model command. Model fit was fine based on CFI (=.971) and TLI (=.955), but RMSEA was .102 (90% CI: .074, .132), and the p-value of the chi-square test was .0000. I was wondering if there are indices in the output I can refer to in order to further improve the model. Thank you in advance for your advice.


Modification indices can be used to see where model fit can be improved. See the MODINDICES option of the OUTPUT command. 
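For example, a minimal sketch requesting modification indices with a lower cutoff than the default of 10:

```
OUTPUT: MODINDICES (3.84);   ! list all indices above 3.84, the 5%
                             ! chi-square critical value with 1 df
```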

Jenny L. posted on Tuesday, May 07, 2013  7:32 pm



Thank you, Prof. Muthen. I'd like to follow up on the question above. My understanding is that modification indices only suggest paths that should be added. Is there anything I can look at in the output to determine whether a particular item should be removed from the factor analysis to improve factorial invariance across time? Thank you again for your help.


You could do an EFA at each time point to see if the items behave as expected. 

Jenny L. posted on Tuesday, May 07, 2013  10:02 pm



Thank you, Prof. Muthen! 

dvl posted on Thursday, January 30, 2014  6:33 am



I have a number of questions: (1) For example: if I have a structural equation model with two scales, "work-to-family conflict" and "family-to-work conflict" (these are my latent concepts), and two exogenous variables, "gender" and "age", should I already include the exogenous variables gender and age in the measurement part of the model? Or should I just study the correlation between the two latent concepts WTF and FTW and the internal consistency of the different scales in the measurement part of the model, so that the covariates gender and age are only brought into the model when the structural part of the SEM model begins? Summarized, my question is: should I include the covariates gender and age in the measurement model or not? (2) I learnt in SAS that we fix the variances of the factors to 1 when estimating a measurement model (or confirmatory factor analysis) in order to solve the scaling problem. How is the scaling problem solved in Mplus when doing a confirmatory factor analysis? In the same way as in SAS? (3) In the structural part, I saw that the first factor loading is fixed to 1; I would rather fix the largest factor loading to 1. How can I change this default?


You will want to study Topic 1 on our website (handout and video), which discusses the questions you raise. For (1), you want to look for "MIMIC" modeling. For (2) and (3), Mplus fixes the first loading as the default, but you can change that to fix any loading, or the factor variance, instead (see the User's Guide for how to do that).
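For (2) and (3), a sketch of the alternative parameterization (hypothetical one-factor example):

```
MODEL: f BY y1* y2 y3;   ! * frees y1's loading (fixed at 1 by default)
       f@1;              ! identify the factor by fixing its variance at 1
```

To fix a different loading at 1 instead, free the first with * and apply @1 to the chosen indicator, e.g. f BY y1* y2 y3@1;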

dvl posted on Thursday, January 30, 2014  2:14 pm



Thanks for answering! 1. One more question: in fact I do not understand why I should use a MIMIC model. As I have read, a MIMIC model studies the influence of gender on WTF; should I not just study the correlation between gender and WTF in the measurement part of my model? 2. Another question: in the second message on this forum it is mentioned that "if you are developing the measurement model as a first step, then only the observed variables that are part of the measurement model should be included in the USEVARIABLES list". What is meant by "observed variables that are part of the measurement model"? Are these only the factor indicators of WTF and FTW in my example, and not gender and age?


1. You use the MIMIC model to look for direct effects between the covariates and the factor indicators. Significant direct effects represent differential item functioning. Please see the Topic 1 course handout and video on the website where measurement invariance and population heterogeneity are discussed for the MIMIC model and multiple group analysis. 2. Yes, this means just the factor indicators. 

Tom Bailey posted on Thursday, July 17, 2014  5:26 pm



Dear Dr. Muthen, I was hoping you might be so kind as to offer me some guidance on the construction of a measurement model in Mplus. I have a structural model that features one exogenous latent factor with four indicators, one exogenous observed variable (dichotomous), and one endogenous observed variable (also dichotomous). If I were to fit a measurement model beforehand, is it unnecessary to also include the observed variables in order to see how the model fits as a whole (as below)?

VARIABLE: NAMES ARE Gender CARINT Q9_a Q9_b Q9_c Q9_d;
CATEGORICAL = Gender CARINT Q9_a Q9_b Q9_c Q9_d;
MODEL: FINREW BY Q9_a Q9_b Q9_c Q9_d;
Gender WITH CARINT;
Gender WITH FINREW;
CARINT WITH FINREW;
ANALYSIS: ESTIMATOR = WLSMV; PARAMETERIZATION = THETA;
OUTPUT: STANDARDIZED; MODINDICES; TECH4;

Many thanks, Tom


I would run the model first without the WITH statements to fit the measurement model. When adding gender and carint, I would use them as covariates in an ON statement. 

Ejlis posted on Wednesday, October 28, 2015  3:37 am



Dear Muthen, Is it ok to have indicators in opposite directions (e.g., positive and negative parenting) on the same latent construct? Thank you, 


This is fine. 

Lucija posted on Tuesday, March 21, 2017  6:05 am



Dear Dr. Muthen, I am testing measurement invariance of a construct across three countries. The modification indices suggest that I should add an error covariance in one of the groups. If I add this to the model, it is estimated, but it is almost zero and non-significant. Surprisingly, the modification indices still suggest that I should add it to the model. Could you please tell me what can cause this and how to solve it? Thank you in advance! Best wishes, Lucija


We need to see your two outputs - send them to Support along with your license number - so we can give you a proper answer.

Lucija posted on Thursday, March 23, 2017  7:32 am



Thank you! I've just sent it. Kind regards, Lucija 


Dear Professor Muthen, I have two latent variables and three time points. From previous models, we know that each latent variable has five indicators. Nevertheless, initial fit is not satisfactory. Thus, after reading some handouts, I introduced correlated errors over time in the model, and the output showed a better fit. However, I still need to improve the measurement model fit. As suggested in several comments on the statmodel website, I took a look at the modification indices and noticed cross-loadings. How can I take this into account in Mplus, without changing the BY syntax (from previous evidence I should not modify the definition of the involved latent variables), in order to improve model fit? Thank you. Kind regards


You have to use BY. So with a cross-loading (assuming two factors f1 and f2) for item y1, say: f1 BY y1; f2 BY y1;
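In context (a hypothetical two-factor, ten-indicator sketch), the cross-loading is simply an extra BY statement; since y1 is not the first indicator listed for f2, its cross-loading is freely estimated:

```
MODEL: f1 BY y1-y5;    ! y1's loading on f1 fixed at 1 for identification
       f2 BY y6-y10;
       f2 BY y1;       ! cross-loading of y1 on f2, freely estimated
       f1 WITH f2;     ! factor covariance (free by default anyway)
```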


Thank you for your kind reply. I have read that not including the covariances between latent variables will result in biased cross-loadings. In this situation, should I specify free covariances between the latent variables or should I fix them to a specific value? Is there an alternative way to improve model fit without introducing cross-loadings? Thank you.


I recommend that you send these general analysis questions to SEMNET. 
