Message/Author 

Anonymous posted on Thursday, December 09, 2004  4:49 pm



Hello, I am a rather new Mplus user. I am performing an SEM where I need to free covariances between some observed variables. How do I do this with Mplus? I haven't been able to find a solution in the user's guide. Thanks 

bmuthen posted on Thursday, December 09, 2004  5:18 pm



Look up the WITH command in the user's guide and you will find the answer. 

Anonymous posted on Friday, December 10, 2004  5:51 am



Thank you! It works perfectly and is much easier than LISREL or EQS. 


Is there a trick to constrain a parameter to be positive without specifying further restrictions? I tried to work with the MODEL CONSTRAINT option in the model statement, but could not make it work. 


This is Tihomir's trick. The important thing is not to introduce a new parameter in the model. The EXP function ensures that the parameter is positive. In a factor model, let parameter p2 be the one to be positive:

f BY y1
     y2 (p2);
!create a dummy parameter p1 (any one will do):
fdum BY y1@0;
fdum (p1);
MODEL CONSTRAINT:
p2 = EXP(p1);


I'm trying to use this trick, but I encounter some problems I could not find solutions for in the discussions. I need to constrain several parameters to be positive and others to be negative. Does this mean I have to create a dummy variable for every constraint? If I do so, Mplus returns an error about negative degrees of freedom. In fact, due to the dummy variables, the number of latent variables is larger than the number of observed variables. Have I misunderstood the trick? Is there a way to solve this problem? 


If I decrease the number of dummies I still get an error (the PSI matrix is not positive definite) involving the first dummy parameter. Standard errors cannot be estimated. 


You can try adding fdum WITH f@0; to the MODEL command. If this does not work, send your input, data, output, and license number to support@statmodel.com. For a negative parameter, use -EXP. 
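Building on Tihomir's trick above, a negative constraint can be sketched the same way; the variable and parameter names here are illustrative and would need to be adapted to your model:

```
! Sketch: constrain loading p2 to be negative (names are hypothetical).
f BY y1
     y2 (p2);        ! p2 is the parameter to keep negative
fdum BY y1@0;        ! dummy factor carrying the free parameter p1
fdum (p1);
fdum WITH f@0;       ! keep the dummy uncorrelated with f
MODEL CONSTRAINT:
p2 = -EXP(p1);       ! -exp(p1) is negative for any real p1
```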


In Mplus, is there a way to constrain the variance of an endogenous latent variable to equal a specific numeric value, or to equal the variance of another endogenous latent variable (as in RAMONA or SEPATH)? 


Yes, you can do this using MODEL CONSTRAINT. See the Mplus User's Guide which is on the website for a full description. 


I can see how to constrain the variance of an endogenous latent variable's disturbance, but would like to place the constraint on the "whole" variance of the latent variable, not just its disturbance. 


You use MODEL CONSTRAINT to express the whole variance as a function of the model parameters, for example using the NEW parameter feature, and then use that expression in your constraints. 
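As a sketch of what this looks like (the variable names and the simple one-predictor structure are hypothetical), the total variance of an endogenous factor f2 regressed on f1 is the explained part plus the disturbance:

```
MODEL:
f1 BY y1-y3;
f2 BY y4-y6;
f2 ON f1 (b);
f1 (vf1);                     ! variance of f1
f2 (rf2);                     ! residual (disturbance) variance of f2
MODEL CONSTRAINT:
NEW(totvar);
totvar = b**2*vf1 + rf2;      ! the "whole" variance of f2
0 = b**2*vf1 + rf2 - 1;       ! e.g., constrain the whole variance to 1
```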

John Hipp posted on Wednesday, September 03, 2008  6:07 pm



Hi, I'm trying to constrain the variance of the disturbance of a single indicator to a specific value based on a reliability score. I'm trying to do this with your nonlinear constraints (as I'm going to estimate a more complicated model where I cannot just set it to a specific value). I'm using this code:

!variance of the indicator error terms;
psblk06 (p1);
pshisp06 (p2);
!variance of latent variable;
ptbl06 (p3);
ptlat06 (p4);
ptbl06 BY psblk06@1;
ptlat06 BY pshisp06@1;
ptbl06 WITH ptlat06;
MODEL CONSTRAINT:
.823 = p3/(p3+p1);
.503 = p4/(p4+p2);

But something's going wrong in that it's constraining the variance of the latent variables to zero. Can you tell me where I'm going wrong? Thanks! 


Please send the input, data, output, and your license number to support@statmodel.com. 


Within the Model Constraint Command, is there a way to specify BY and ON statements? 


No. Linear and nonlinear constraints can be defined using the equal sign (=), the greater than sign (>), the less than sign (<), and all arithmetic operators and functions that are available in the DEFINE command, with the exception of the absolute value function. 


When I tried to constrain the error variance to SD^2 x (1 - alpha) and the latent-to-indicator path to sqrt(1 - [SD^2 x (1 - alpha)]), I received an error message that the PSI matrix is not positive definite. Subsequently, I tried Tihomir's trick as recommended, but I receive an error message informing me that "A parameter label or the constant 0 must appear on the left-hand side of a MODEL CONSTRAINT statement." How should I go about resolving this issue? Thanks! 


Please send the full output and your license number to support@statmodel.com. 


I used the MODEL CONSTRAINT command for significance testing. In the SAVEDATA section, I saved the results of the analysis in a text file. In the Mplus output I can see results for the New/Additional Parameters. However, when I try to find them in the text file, I don't see them, nor are they listed in the Order of Data at the end of the Mplus output. Are the MODEL CONSTRAINT results saved with the other estimates in the text file? If so, where would I find them? If not, how can I save them? 


We do now save the NEW parameters using the RESULTS option of the SAVEDATA command. Please send your output and license number to support@statmodel.com if you can't see this. 
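For reference, the relevant SAVEDATA specification is simply the RESULTS option; the file name here is only an example:

```
SAVEDATA:
RESULTS = results.dat;
```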

Matt C posted on Friday, May 07, 2010  5:16 pm



Hello, I'm still familiarizing myself with doing constraints in Mplus, so I apologize for the rudimentary question. I'm running an LGM using four indicators modeled with latent intercept, linear, and quadratic parameters. Results suggested that the covariance between the intercept and linear slope terms was nonsignificant; thus, I initially attempted to constrain it to zero as below:

MODEL:
i s q | perf1@0 perf2@1 perf3@2 perf4@3;
s WITH i (p1);
MODEL CONSTRAINT:
p1 = 0;

Running that was giving me trouble (I was getting results indicating that parameter SEs could not be estimated because the model may not be identified). I later tried running it as below, which did work as intended. My question: what is it that the syntax above is doing that is different from that shown below? Thank you greatly for any information!

MODEL:
i s q | gpa1@0 gpa2@1 gpa3@2 gpa4@3;
s WITH i@0;
!MODEL CONSTRAINT:
!p1 = 0;

Regards, Matt 


I have tried the two versions of fixing i WITH s at zero and have no problem. I see that you get different outcomes. For further information, send your outputs and license number to support@statmodel.com. 


"We do now save the NEW parameters using the RESULTS option of the SAVEDATA command." I was using an older Mplus version. Is this feature only available in the new Mplus V6 release? 


I am not sure when we started doing this. I don't believe it is new in Version 6. 

Anna posted on Friday, June 25, 2010  2:12 am



Hello, I am using the MODEL CONSTRAINT command in order to constrain the correlations between the slopes and the levels of two latent variables in a latent growth model to be equal, rather than constraining the covariances to be equal. However, in the output, the correlations are not equal, but I did not get any error message either. Do you know where I went wrong? Thank you very much for your answer!! 


Please send your full output and your license number to support@statmodel.com. 


Dear Linda, I'm trying to understand how Mplus calculates the standard error of a model constraint. I have a model constraint wherein I am contrasting two correlated parameter estimates. For example:

MODEL CONSTRAINT:
NEW(diff_B1B2);
diff_B1B2 = B1 - B2;

Is Mplus calculating the SE from Variance(B1) + Variance(B2) - 2*Covariance(B1,B2)? Thanks! 


Mplus uses the Delta method which in the case of a linear function of random variables is the expression you give. See page 247 of Kendall and Stuart, The Advanced Theory of Statistics, Volume 1, Fourth Edition. 
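Spelled out, for a linear combination of estimates the delta method reduces to the exact variance formula referred to in the reply above:

```latex
\operatorname{Var}(\hat{B}_1 - \hat{B}_2)
  = \operatorname{Var}(\hat{B}_1) + \operatorname{Var}(\hat{B}_2)
    - 2\,\operatorname{Cov}(\hat{B}_1, \hat{B}_2),
\qquad
SE(\hat{B}_1 - \hat{B}_2) = \sqrt{\operatorname{Var}(\hat{B}_1 - \hat{B}_2)}.
```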


Thank you! 


Hi, I have the following constraint in my script:

0 = ch13^2 + ch14^2 + ch15^2 + ch25^2 + ch26^2 + ch27^2 + ch34^2 + ch35^2 + ch36^2 + ch40^2 + ch41^2 + ch42^2 + ch43^2 + ch44^2 + ch45^2 - 1;

but it appears to be too long to fit on one line. I thought I'd try something like

a = ch13^2 + ch14^2 + ch15^2;
0 = a + ch25^2 + ch26^2 + ch27^2 + ch34^2 + ch35^2 + ch36^2 + ch40^2 + ch41^2 + ch42^2 + ch43^2 + ch44^2 + ch45^2 - 1;

but that does not work, as I need to have a parameter label or a 0 on the left-hand side. Is there a trick to do this? 


OK, got it. I did:

xx BY x1@0;
xx*.2 (tr);
tr = ch13^2 + ch14^2 + ch15^2;
0 = tr + ch25^2 + ch26^2 + ch27^2 + ch34^2 + ch35^2 + ch36^2 + ch40^2 + ch41^2 + ch42^2 + ch43^2 + ch44^2 + ch45^2 - 1;


You can also let your expression continue over several lines; you don't need to have a semicolon on each line. 


thanks! 


Is there any difference between alpha constraints, for example (p1), and numeric constraints, for example (1)? When would a user want to use alpha (letter) labels as opposed to numbers? Is it just that the alpha labels allow for more complex mathematical statements to be written in the MODEL CONSTRAINT section? Is it correct that, other than this, there is no difference? 


There is no difference. 

jas229 posted on Tuesday, July 10, 2012  12:55 pm



Dear Prof. Muthen, We are trying to test a model with causal stress indicators to study the effect of a latent stress variable on two outcomes (negative and positive affect). In order to do this, we were hoping to follow Bollen & Davis's (2009) suggestion for testing underidentified models by using proportionality constraints. We need to include direct pathways between the causal indicators and the outcome variables and then impose proportionality constraints on these paths that assume that the influence of the causal indicators is mediated through a causal latent variable. We tried imposing these constraints using the MODEL CONSTRAINT command by naming the proportions between the paths that need to be constrained and then setting them equal. However, we get this error: NO CONVERGENCE. SERIOUS PROBLEMS IN ITERATIONS. ESTIMATED COVARIANCE MATRIX NONINVERTIBLE. CHECK YOUR STARTING VALUES. Could you please guide us on the syntax we need to impose these constraints in order to run the model? Thank you, Deepika 


Please send your output and license number to support@statmodel.com. 


Dear Prof. Muthen, Recently I read the paper by van de Schoot et al. (2010) on testing inequality constrained hypotheses, and I am interested in the procedure implemented in Mplus to estimate models with inequality-constrained parameters. I have already used it, but I would like to know more about how it works. The procedure is briefly described in van de Schoot et al., and I am wondering if you have a paper or technical report that describes this procedure in more detail. Thank you very much for your help. Best regards, Francisca 


Dear Francisca, There are two ways of testing inequality constraints in Mplus: (1) using bootstrapping and (2) using Bayes factors. There are two publications describing the procedure for (1):

- Van de Schoot, R., Hoijtink, H., & Deković, M. (2010). Testing inequality constrained hypotheses in SEM models. Structural Equation Modeling, 17, 443-463.
- Van de Schoot, R., & Strohmeier, D. (2011). Testing informative hypotheses in SEM increases power: An illustration contrasting classical hypothesis testing with a parametric bootstrap approach. International Journal of Behavioral Development, 35, 180-190.

Two papers describing the second method are in press:

- Van de Schoot, R., Hoijtink, H., Hallquist, M. N., & Boelen, P. A. Bayesian evaluation of inequality-constrained hypotheses in SEM models using Mplus. Structural Equation Modeling.
- Van de Schoot, R., Verhoeven, M., & Hoijtink, H. Bayesian evaluation of informative hypotheses in SEM using Mplus: A black bear story. European Journal of Developmental Psychology.

If you want a copy of these papers, please send me an email and I will be happy to share them. Best regards, Rens van de Schoot 


Hello, Can I impose a constraint using MODEL CONSTRAINT that maximizes a correlation between two factors in the model? (I see other constraints, such as that a parameter be less than or greater than a certain value. So perhaps there is a way to constrain the parameter such that the correlation is maximized?) Thanks, Lisa 


This is not a function in MODEL CONSTRAINT. 

Steve posted on Wednesday, June 26, 2013  9:37 am



Hello, I am a relatively new user of Mplus. Basically, I need to know how to constrain the correlation between two latent variables to 1. That is, I need to conduct a chi-square difference test using a CFA of two latent variables F1 and F2: Model 1 with the correlation between F1 and F2 freely estimated vs. Model 2 with the correlation between F1 and F2 set to 1 (with a 1 df change). Model 1 works fine:

MODEL:
F1 BY X1 X2 X3;
F2 BY X4 X5 X6;

For Model 2, I am using:

MODEL:
F1 BY X1 X2 X3;
F2 BY X4 X5 X6;
F1 WITH F2@1;

However, the output for Model 2 does not provide model fit nor STDYX statistics. Many thanks. 


Why do you need STDYX for Model 2? Aren't you just interested in the test result? Note also that F1 WITH F2 is not a correlation but a covariance, given that the factor variances are not 1. To make it into a correlation, you have to fix the metric by fixing the factor variances at 1 instead of the default of fixing the first loading at 1. By the way, you can get your test by running only Model 1 and using MODEL TEST to see if the correlation = 1. 

Steve posted on Wednesday, June 26, 2013  1:35 pm



Dear Bengt, Thank you for your quick response. Yes, I don't need STDYX; I just noticed it was not there for comparison and to check that the correlation between F1 and F2 was in fact 1. If I understand correctly, I should use:

MODEL:
F1 BY X1* X2 X3;
F2 BY X4* X5 X6;
F1@1;
F2@1;

Then for Model 2 use:

MODEL:
F1 BY X1* X2 X3;
F2 BY X4* X5 X6;
F1@1;
F2@1;
F1 WITH F2@1;

I also tried the MODEL TEST version as you suggested, but was unable to get it to work due to incorrect input. I consulted the manual, but cannot figure out how to do this. Many thanks if you could confirm! 


Looks correct. MODEL TEST would say: 0 = p1 - 1; where p1 is the label for F1 WITH F2. 
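Putting the pieces of this exchange together, a minimal sketch of the single-run MODEL TEST approach (using the variable names from the posts above) would be:

```
MODEL:
F1 BY X1* X2 X3;
F2 BY X4* X5 X6;
F1@1;                 ! fix factor variances at 1 so that
F2@1;                 ! F1 WITH F2 is a correlation
F1 WITH F2 (p1);      ! label the correlation
MODEL TEST:
0 = p1 - 1;           ! Wald test of correlation = 1
```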

Steve posted on Thursday, June 27, 2013  3:34 am



Dear Bengt, Thanks so much for your help! 


Hello, I was reading up on parameter constraints in SEM and was unclear on why one would want to fix a parameter to 1 or 0 as follows: y on x1 x2@1 x3 or y on x1 x2@0 x3 Would you be able to point me to any relevant literature? Thanks. 


The constraints in your model should be based on theory, your hypotheses, and your research questions. Usually, paths are simply left out of a model (which fixes them at zero) rather than fixed to a value other than zero. 


Hello, I am calculating the within and between variances for a group of variables using TWOLEVEL BASIC. For one of the variables the between variance is close to 0 so the model doesn't converge. I still want to know the within variance. Mplus suggests that I fix the variance and corresponding covariance to 0. How can I do this with only one variable? Thanks! 


Just fix the between variance. 


Thanks for your answer! That's what I've been trying to do, but I still get the same message. I wonder if I'm doing it wrong. This is the syntax I've used:

USEVARIABLES = x;
CLUSTER = y;
ANALYSIS:
TYPE = TWOLEVEL BASIC;
MODEL:
%BETWEEN%
x@0;
OUTPUT:
standardized; sampstat; 


Please send the output with the original message and your license number to support@statmodel.com. 


Hello, I am conducting a path analysis and one of my major study hypotheses is that predictor A accounts for a significant portion of the overlap in variance between variables X and Y. I found that A does predict both X and Y, but I want to show that including A in the path model reduces the covariance between X and Y. Is it possible to explicitly test this and if so, what would be the best way to go about doing this? Many thanks 


You can look at the residual covariance between x and y and compare it to the model estimated covariance. 


Linda, Great, thank you so much for your reply. In regards to the question I posted above, others have suggested to me that in model 1, I look at the model estimated correlation between X and Y when A is not included as a predictor. Then, in Model 2, I could include A as a predictor and constrain X and Y's correlation to be the same as in Model 1. If this constraint does not significantly decrease model fit, then I could assume that A does account for a significant portion of the overlap in variance between X and Y. Would this test be possible (I couldn't figure out if it is even possible to constrain the correlation between two variables to a particular value)? Or do you recommend what you've suggested above? Thank you again. 


Oops, a correction to the question I just posted: Linda, Great, thank you so much for your reply. In regards to the question I posted above, others have suggested to me that in model 1, I look at the model estimated correlation between X and Y when A is not included as a predictor. Then, in Model 2, I could include A as a predictor and constrain X and Y's correlation to be the same as in Model 1. If this constraint DOES significantly decrease model fit, then I could assume that A accounts for a significant portion of the overlap in variance between X and Y. Would this test be possible (I couldn't figure out if it is even possible to constrain the correlation between two variables to a particular value)? Or do you recommend what you've suggested above? Thank you again. 


This type of question is more suitable for a general discussion forum like SEMNET. 


Dear Mplus team, I am running an autoregressive cross-lagged model and would like to test for gender differences. I am not sure how to add a constraint on gender, and also how to see if the constrained model is significantly different from the freely estimated one.

NAMES = Famid sex w2age w3age w4age w2opp w3opp w4opp w2del w3del w4del w2phy w3phy w4phy w2dep w3dep w4dep;
CLUSTER = Famid;
GROUPING = sex (1 = male 2 = female);
USEVARIABLES = Famid sex w2opp w2del w2dep w3opp w3del w3dep w4opp w4del w4dep;
MISSING = ALL(999);
ANALYSIS:
TYPE = COMPLEX;
ESTIMATOR = MLR;
ITERATIONS = 1000;
CONVERGENCE = 0.00005;
MODEL:
w3opp ON w2opp w2del w2dep;
w3del ON w2opp w2del w2dep;
w3dep ON w2opp w2del w2dep;
w4opp ON w2opp w2del w2dep w3opp w3del w3dep;
w4del ON w2opp w2del w2dep w3opp w3del w3dep;
w4dep ON w2opp w2del w2dep w3opp w3del w3dep;
w2opp WITH w2del;
w2opp WITH w2dep;
w2dep WITH w2del;
w3opp WITH w3del;
w3opp WITH w3dep;
w3dep WITH w3del;
w4opp WITH w4del;
w4opp WITH w4dep;
w4dep WITH w4del;
OUTPUT: MOD STAND;

Many thanks 


If you want to test a coefficient for males versus females, you need to mention the parameter in the group-specific parts of the MODEL command. See the discussion of multiple group analysis in Chapter 14 of the user's guide. 


Thank you for highlighting this, as I forgot to ask about this as well. I had a look at the chapter on multiple group analysis; from what I understand, the chi-square test is to be used to get an idea of the significance, but I might be wrong in interpreting the information. I'm not sure where I can specify the tests and what code to use in the model. In addition, I want to test whether the constrained model is significantly different from the model where males and females are not constrained. Is there a way around this? Many thanks 


When you label group-specific parameters, you can test whether they are different using the Wald test of MODEL TEST, or chi-square difference testing of the model with the parameters held equal versus the model with the parameters not equal. 


Thanks for the prompt response. I have tried to use the chi-square test for the data, but I get this warning: "The chi-square value for MLM, MLMV, MLR, ULSMV, WLSM and WLSMV cannot be used for chi-square difference testing in the regular way." This is based on this model:

GROUPING = sex (1 = male 2 = female);
CLUSTER = Famid;
ANALYSIS:
TYPE = COMPLEX;
ESTIMATOR = MLR;
ITERATIONS = 1000;
CONVERGENCE = 0.00005;
MODEL:
w3opp ON w2opp;
w3opp ON w2del;
w3opp ON w2dep;
w3del ON w2opp;
w3del ON w2del;
w3del ON w2dep;
MODEL male:
w3opp ON w2opp (1);
w3opp ON w2del (2);
w3opp ON w2dep (3);
w3del ON w2opp (4);
w3del ON w2del (5);
w3del ON w2dep (6);
MODEL female:
w3opp ON w2opp (1);
w3opp ON w2del (2);
w3opp ON w2dep (3);
w3del ON w2opp (4);
w3del ON w2del (5);
w3del ON w2dep (6);
OUTPUT: MOD STAND;

I am also not sure if the code above is correct for testing the constraint on sex. I am not familiar with the Wald test; can you please give an example with my variables that I can use as the base for my model? Many thanks 


The Wald testing is explained under MODEL TEST in the User's Guide. 
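As a sketch using the variables from the post above (the labels p1m and p1f are illustrative): to Wald-test a group difference, give the same parameter different labels in the two groups and contrast them in MODEL TEST, rather than using the same numeric label in both groups (which constrains them equal):

```
MODEL male:
w3opp ON w2opp (p1m);
MODEL female:
w3opp ON w2opp (p1f);
MODEL TEST:
0 = p1m - p1f;        ! Wald test of equal coefficients across groups
```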
