

hi linda and hi bengt! i have a question concerning fixed parameters within formative (or causal) measurement models (FMM) like SES. my aim is to estimate a SEM with an exogenous FMM and an endogenous ordinal reflective measurement model (RMM). the model fit of WRMR = .907 looks appropriate, and a comparable parameter-estimate structure was obtained by a PLS model. my problem for now is the following: apparently the FMM regression weights are very sensitive to the fixing value. the following model provides a value of .01 as a nice approximation. is there a rule of thumb for the fixing value within FMMs? here is the corresponding model:

CATEGORICAL: pride1 pride2 pride3;
MODEL:
  repu BY;
  repu ON repu1@0.01 repu2 repu3;
  repu@0;
  pride BY pride1 pride2 pride3;
  pride ON repu;

thanks in advance!


If you set the metric using .01 instead of 1, the FMM regression weights will change, but their relative sizes should remain constant.
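To make the scaling concrete, here is a sketch of the same model with the more conventional fixing value of 1 (everything else as in the input above; an illustration, not a recommendation for a particular fixing value):

```
CATEGORICAL: pride1 pride2 pride3;
MODEL:
  repu BY;                      ! formative factor, no reflective indicators
  repu ON repu1@1 repu2 repu3;  ! metric set by fixing the first weight at 1
  repu@0;                       ! no disturbance for the composite
  pride BY pride1 pride2 pride3;
  pride ON repu;
```

Relative to the @0.01 run, the estimated weights for repu2 and repu3 should come out 100 times larger and the pride ON repu coefficient 100 times smaller, leaving the ratios among the FMM weights, and the model fit, unchanged.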


linda, thank you for the immediate reply. i have an additional question concerning the 'statistical fit' of formative indicators, with a potential regard to scale purification. let's assume one has no access to PLS (within a theory-'building' approach); then there would be no guidance for setting the metric via a discrete indicator weight (in order to obtain trustworthy t-values). is there an alternative to the PLS pre-'testing', or would you consider this procedure appropriate? cheers, christian


The choice of constant or variable for setting the metric does not affect the model fit; the same restrictions on the covariances between the FMM indicators and the DVs are obtained.


Hi Linda and/or Bengt, A student and I are trying to fit our first causal indicator measurement model. From the syntax on slide 246 of the Mplus Short Courses Topic 1 Handout, it looks to us as if there is no disturbance term for f, your formative construct. Are we interpreting that correctly? If so, this would seem to us more akin to what Bollen would call a composite-indicator model than a causal-indicator model, and we would appreciate guidance regarding the syntax we should use to give the construct a disturbance term. Thanks very much!


I am not sure what Bollen means by a causal indicator model. Can you send the paper to support@statmodel.com? 


Yes, the formative model we give the specification for is what Bollen-Bauldry (2011, Psych Methods) call composite indicators. What they refer to as causal indicators can simply be specified as a MIMIC-type model; no special syntax is needed. See their Figure 4, where the causal indicators behave like regular covariates; to me, it is more of a conceptual distinction.
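A causal-indicator specification in the MIMIC style can be as simple as the following sketch (variable names hypothetical):

```
MODEL:
  eta BY y1 y2 y3;  ! reflective indicators identify eta and its disturbance
  eta ON x1-x3;     ! causal indicators entered like regular covariates
```

Because eta has its own reflective indicators, its disturbance variance is estimated by default; nothing needs to be fixed to give the construct an error term.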


What alterations would I have to make to the model on slide 246 (Topic 1) to obtain the model shown in Fig. 4a of the Bollen-Bauldry paper? Is it possible? Thanks.


What's the reference? You don't mean the Bollen-Bauldry (2010) SM&R paper, right?


Bollen, K. A., & Bauldry, S. (2011). Three Cs in measurement models: Causal indicators, composite indicators, and covariates. Psychological Methods, 16(3), 265-284. Figure 4a is on page 276.


So I assume you want the parameterization of eq. (13) on page 11. So say:

  eta1 BY y1@1;    ! scale of eta1 set by the single reflective indicator
  [y1@0];          ! indicator intercept fixed at zero
  eta1 ON x1-x3;   ! causal indicators
  [eta1];          ! factor intercept free


Bollen and Bauldry suggest on p. 279 (or p. 14) looking at an indicator's unique validity variance, which they define as the difference between the R-square for eta with all causal indicators and the R-square for eta without the causal variable x_i. Mplus provides R-square for the latent variable. How can I get (or compute) the second value to obtain the unique validity variance? Thank you.


Are you referring to eqn (21)? 


Eqn. 24 


I think you would have to do two different runs and get the two R-square values.
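For example, with hypothetical causal indicators x1-x4 and reflective indicators y1-y3, the two runs for the unique validity variance of x1 might look like:

```
! Run 1: all causal indicators
MODEL:
  eta BY y1 y2 y3;
  eta ON x1-x4;
OUTPUT: STANDARDIZED;  ! reports R-square for eta

! Run 2 (a separate input file): same model without x1
MODEL:
  eta BY y1 y2 y3;
  eta ON x2-x4;
OUTPUT: STANDARDIZED;
```

The unique validity variance of x1 is then the Run 1 R-square for eta minus the Run 2 R-square for eta, computed by hand from the two outputs.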

Lois Downey posted on Thursday, June 30, 2016 - 1:20 pm



I used a suggestion from Bollen and Bauldry's article to build the following syntax for estimating a model with causal indicators:

USEVARIABLES = y1 y2 x1-x4;
CATEGORICAL = y1 y2;
MODEL:
  Factor BY;
  Factor ON x1@1 x2-x4;
  y1 y2 ON Factor;
  y1 WITH y2@0;

Although that is a different method for achieving model identification than the one you suggest in your course handout, it avoids having to use a composite variable between the indicators and the latent variable of interest, and the model ran as expected. I then wanted to test the model for between-group invariance of the factor indicators. To do that, I added a grouping variable and changed the MODEL statement as follows:

GROUPING = country (0=US 1=Canada);
MODEL:
  Factor BY;
  Factor ON x1@1
            x2 (1)
            x3 (2)
            x4 (3);
  [Factor@0];
  y1 y2 ON Factor;
  y1 WITH y2@0;
MODEL Canada:
  [Factor];

However, this model produced an error message indicating that the model may not be identified, pointing to the factor intercept in the Canada group as the problematic component. What additional constraint(s) does the model need for statistical identification? Thanks!


Your single-group model seems to have a free residual variance for the factor; is that really identified? Your two-group model is not identified because the two intercepts of y1 and y2 in their regressions on the factor cannot be identified together with the factor intercepts. It would be identified if you hold those y intercepts equal across the groups.
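Since y1 and y2 are on the CATEGORICAL list, their "intercepts" are thresholds in Mplus. One sketch of the constrained two-group model (the threshold labels (4) and (5) continue the poster's numbering and are hypothetical) is:

```
MODEL:
  Factor BY;
  Factor ON x1@1
            x2 (1)
            x3 (2)
            x4 (3);
  [Factor@0];
  y1 y2 ON Factor;
  y1 WITH y2@0;
  [y1$1] (4);     ! thresholds of y1 and y2 held equal across groups
  [y2$1] (5);
MODEL Canada:
  [Factor];       ! factor intercept free in Canada, fixed at 0 in US
```

If y1 and y2 have more than two categories, the additional thresholds ([y1$2], and so on) would be constrained the same way.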
