

I am analysing data using a Bayesian mixture model. I am checking the autocorrelation of the generated values and wondering whether all chains must meet the criterion of having autocorrelation less than .1, or whether one chain is enough. 


If it is a long chain, one should be enough for this check. Also, look out for label switching by inspecting the trace plots. 
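For readers who want to see what this autocorrelation check computes, here is a small Python sketch (not Mplus code; the simulated chains are purely hypothetical) of the lag-1 sample autocorrelation and the .1 cutoff mentioned above:

```python
import numpy as np

def autocorrelation(chain, lag=1):
    """Sample autocorrelation of an MCMC chain at the given lag."""
    chain = np.asarray(chain, dtype=float)
    centered = chain - chain.mean()
    # Normalize the lag-k autocovariance by the lag-0 autocovariance (the variance).
    return float(np.dot(centered[:-lag], centered[lag:]) / np.dot(centered, centered))

rng = np.random.default_rng(0)

# A nearly independent chain: lag-1 autocorrelation close to 0,
# comfortably under the .1 criterion.
iid_chain = rng.normal(size=5000)
print(autocorrelation(iid_chain))

# A "sticky" AR(1) chain with phi = 0.95: lag-1 autocorrelation near 0.95,
# far above the criterion, so this chain mixes poorly and would need
# thinning or many more iterations.
ar_chain = np.empty(5000)
ar_chain[0] = 0.0
for t in range(1, 5000):
    ar_chain[t] = 0.95 * ar_chain[t - 1] + rng.normal()
print(autocorrelation(ar_chain))
```

In a real analysis this check would be applied per parameter; checking it in every chain is the conservative choice.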


How many iterations are considered long enough? And does "label switching" refer to the red vertical line in the trace plot? 


Hard to say; perhaps twice as long as it took to get the PSR (potential scale reduction) down to convergence. For label switching, see Section 8.2 of the technical report on our web site: Asparouhov, T. & Muthén, B. (2010). Bayesian analysis using Mplus: Technical implementation. Technical Report, Version 3. See also the handout for Topic 9 on our web site, slides 82-85, especially the trace plot on slide 84, which shows a low and a high set of estimates corresponding to two different solutions. The label switching topic needs to be well understood before doing Bayes mixture analyses. 
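The trace-plot pattern described above (a low and a high set of estimates from two different solutions) can be mimicked with a toy Python example; everything here is simulated and hypothetical, not Mplus output:

```python
import numpy as np

# Hypothetical two-class mixture where the sampler swaps class labels
# halfway through the run, producing the "low and high set of estimates"
# pattern seen in a trace plot.
rng = np.random.default_rng(1)
n = 2000
mu_class1 = 1.0 + 0.05 * rng.normal(size=n)   # draws for the class-1 mean
mu_class2 = 3.0 + 0.05 * rng.normal(size=n)   # draws for the class-2 mean

# Label switch: after iteration 1000 the sampler calls class 2 "class 1".
trace1 = np.where(np.arange(n) < 1000, mu_class1, mu_class2)
trace2 = np.where(np.arange(n) < 1000, mu_class2, mu_class1)

# The standard deviation is large because the trace jumps between ~1 and ~3.
print(trace1.std())

# One common fix: impose an ordering (identifiability) constraint by
# relabeling each iteration so the smaller mean is always "class 1".
relabeled1 = np.minimum(trace1, trace2)
# Now the standard deviation is small: a single, stable solution.
print(relabeled1.std())
```

The ordering constraint shown here is only one of several relabeling strategies; which one is appropriate depends on the model.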

Carolyn Hou posted on Monday, February 28, 2011  6:18 pm



I have several questions regarding Bayes mixture modeling. 1. When "Data imputation: Plausible=" commands are used, is the algorithm still MCMC by default? 2. What is the difference conceptually between parameter estimates with and without using this command? 3. Which set of parameter estimates should be used for more accuracy, those with this command, or those without it? Thank you very much. 


1. Yes. 2-3. There is no difference. Plausible values, like factor scores, are computed after model estimation. 

Tim Stump posted on Friday, September 28, 2012  7:56 am



I am using estimator=bayes to run LCA of 11 binary items. My question is about how to best know when the process has converged. I set fbiterations to a large value, say 25,000. Some of my models conclude with PSR<1.1, but another one I ran had PSR hovering between 8 and 10 and never getting close to 1. Do you have suggestions on what you might do in a case like this? If the model is misspecified, would this be a characteristic you see in the model fitting process? I'm curious about a general strategy for model fitting/checking for convergence using bayes for lca. 

Tim Stump posted on Friday, September 28, 2012  8:19 am



Just a follow-up to my previous post. Let's say you have a PSR that hovers between 2 and 4 for a large number of FBITERATIONS, not as extreme as a PSR between 8 and 10. What might be a next step to figure out how to obtain convergence? 


Bayesian estimation of mixture models takes some studying because it is not a straightforward analysis. A key issue is label switching. This is discussed in the 6/1/11 Topic 9 course video and handout on our web site. The use of PSR also needs to be studied; issues like premature PSR convergence and non-identification are discussed in the new Utrecht video and handout on our web site. PSR values of 2-4 are not really any different from 8-10 in that both settings indicate problems in the "mixing" (convergence), due either to label switching or to non-identification. 
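As a rough illustration of why stuck chains keep the PSR high, here is a Python sketch of the potential scale reduction computation (a standard Gelman-Rubin form; the simulated chains are hypothetical, not Mplus output):

```python
import numpy as np

def psr(chains):
    """Gelman-Rubin potential scale reduction for m chains of length n."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()   # average within-chain variance
    B = n * chain_means.var(ddof=1)         # between-chain variance
    var_hat = (n - 1) / n * W + B / n       # pooled variance estimate
    return float(np.sqrt(var_hat / W))

rng = np.random.default_rng(2)

# Well-mixed chains sampling the same posterior: PSR near 1.
good = rng.normal(size=(2, 4000))
print(psr(good))

# Chains stuck at two different solutions (as with label switching or
# non-identification): between-chain variance dominates and PSR stays high
# no matter how many iterations are added.
stuck = np.vstack([rng.normal(0, 1, 4000), rng.normal(8, 1, 4000)])
print(psr(stuck))
```

The second case is the point of the reply above: extra iterations shrink neither B nor the PSR when the chains are sampling different solutions.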


Hello, I am trying to model a multigroup latent Markov chain using known classes. While I was able to run the model using MLR with random starts, I would like to know if this type of analysis is possible using Bayesian estimation. For the moment I keep getting this error: "Analysis with more than one categorical latent variables is not allowed with ESTIMATOR=BAYES." Thank you, Alex 


This is not allowed yet with BAYES. 

Alden Gross posted on Tuesday, August 27, 2013  2:53 pm



Hello, In growth mixture models and latent class analysis, neither latent class regression nor post-estimation tests of means using the AUXILIARY option are available with ESTIMATOR=BAYES. How, then, can I appropriately examine predictors of latent class membership when my estimator is BAYES? I do not want the predictor variables to affect the formation of the latent classes. 


For LCA you can do the 3-step analysis, but for GMM that is trickier; see Web Note 15. 


I mean doing the 3-step "manually" as described in the web note. 

Alden Gross posted on Wednesday, August 28, 2013  2:56 pm



Thank you! I replicated the webnote and adapted it for my purposes. 

Ted Fong posted on Thursday, March 27, 2014  8:10 am



I have conducted a Bayes mixture model with one categorical latent variable and 2 continuous latent variables. I would like to ask: 1) Is label switching still an issue of concern in Bayes mixture modeling in Mplus 7? 2) It seems TECH11 and TECH13 are not available with the Bayes estimator. The output file did not show any results for TECH12 and TECH14. Are these two also not available with Bayes? 3) With Bayes, all of the auxiliary options (R3STEP/DU3STEP/E/DCAT/DCON) for comparing classes are not available, right? 4) There are no BIC/DIC shown in my Bayes mixture model. I can only find the posterior predictive p-value. In this case, do you have any suggestions on other tools to determine the fit of competing models? Thank you very much. 


1. Yes. 2. These are not available with Bayes. 3. Correct. 4. Nothing more is available at this time. 

Ted Fong posted on Thursday, March 27, 2014  8:11 pm



Dear Dr. Muthén, Thanks so much for the prompt clarification on mixture modeling using Bayesian estimation! Ted 

Dena Pastor posted on Tuesday, January 24, 2017  10:11 am



Is DIC still not available for Bayesian LCAs in Mplus? 


Still not available. DIC for the general Mplus mixture model is difficult or impossible to compute and we have not made progress on the special cases where it is feasible such as LCA. 


Hello, My GMM exported means plot data doesn't conceptually make sense to me. Here is my exported plot data:

Time  Class 1  Class 2
0     3.58814  3.24167
1     3.53889  2.98282
2     3.55322  2.78756
3     3.57139  2.59613

According to my model output, my intercept for Class 1 is 3.336, and for Class 2, 2.989. Shouldn't the means of my plot data at t0 be the same as the Class 1 and Class 2 intercepts? Thanks, Nicole 


Send your full output to Support along with the exported plot data file and your license number. 


Hello Drs. Muthén, Earlier in this thread (March 27, 2014) I see that it is not possible to use the (automatic) auxiliary variable options with the Bayes estimator. From my attempts it seems this is still the case in Version 8, but I'm wondering if I can implement the manual 3-step approaches as mentioned in the web notes (specifically for the BCH and DCAT methods)? I have attempted to do this, and I get the overall model to converge just fine, but whenever I specify SAVE = BCHWEIGHTS, I get the following error: SAVE=BCHWEIGHTS is not available with ALGORITHM=INTEGRATION. Elsewhere I have seen discussion of this error, and it was said that integration is needed for data with missingness, so I tried with complete data and got the same error. Thanks for your help! 


First note that we do not have a manual DCAT approach (just to be clear); we have the manual 3-step approach, which can be used with categorical distal outcomes. That approach is described in Web Note 15, Section 3, or on page 332 of http://cds.web.unc.edu/files/2017/03/3_Asparouhov_and_Muthen_2014.pdf I do think that the manual BCH and 3-step approaches can in principle be conducted with the Bayes estimator; however, we have not done it yet, and a number of the steps would require a bit more work than with the ML estimator. A manual Bayes 3-step would be easier for you to do. A manual Bayes BCH would not be easy at all, since Mplus Bayes will not accept any weights (not even BCH weights). 
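To illustrate one concrete piece of the manual 3-step approach referred to above, the following Python sketch converts a classification table into the logits that the step-3 model fixes. The probabilities here are invented for illustration, and the full procedure is described in Web Note 15:

```python
import numpy as np

# Hypothetical classification-quality table for a 2-class model:
# rows are the most likely class N, columns are the probabilities of the
# true class C given N (values are made up for this example).
classification_probs = np.array([
    [0.90, 0.10],
    [0.15, 0.85],
])

# Multinomial logits relative to the last class:
# log(p_k / p_last) within each row. These are the values held fixed
# for the nominal most-likely-class indicator in the step-3 model.
logits = np.log(classification_probs / classification_probs[:, -1:])
print(logits)
```

With these numbers the first row gives log(0.90/0.10) for class 1 and 0 for the reference class; the better the classification, the larger the magnitude of the logits.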
