

I am analysing data using a Bayesian mixture model. I am checking the autocorrelation of the generated chains and wondering whether all chains must meet the criterion of autocorrelation less than .1, or whether one chain is enough. 


If it is a long chain, one should be enough for this check. Also, look out for label switching by inspecting the trace plot. 
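As an aside for readers: the autocorrelation check discussed here can be done outside Mplus on saved MCMC draws. Below is a minimal sketch (not Mplus code; the function name and the i.i.d./AR(1) examples are illustrative assumptions) of computing the lag-k sample autocorrelation of a single chain and comparing it to the .1 cutoff mentioned above.

```python
import numpy as np

def autocorr(chain, lag):
    """Lag-k sample autocorrelation of a 1-D MCMC chain."""
    chain = np.asarray(chain, dtype=float)
    c = chain - chain.mean()
    # Normalized autocovariance at the given lag
    return float(np.dot(c[:-lag], c[lag:]) / np.dot(c, c))

rng = np.random.default_rng(0)

# A chain of independent draws: lag-1 autocorrelation is near 0, well under .1
iid_chain = rng.normal(size=10_000)
print(autocorr(iid_chain, 1))

# A sticky AR(1) chain (rho = 0.9): lag-1 autocorrelation is far above .1,
# signalling that the chain needs thinning or more iterations
ar_chain = np.zeros(10_000)
for t in range(1, ar_chain.size):
    ar_chain[t] = 0.9 * ar_chain[t - 1] + rng.normal()
print(autocorr(ar_chain, 1))
```

A high lag-1 value means consecutive draws carry little new information, which is why a long, well-mixed chain is wanted before trusting this diagnostic.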


How many iterations are considered long enough? And does "label switching" mean the red vertical line in the trace plot? 


Hard to say; perhaps twice as long as it took to get the PSR down to convergence. For label switching, see Section 8.2 of the technical report on our web site: Asparouhov, T. & Muthén, B. (2010). Bayesian analysis using Mplus: Technical implementation. Technical Report, Version 3. See also the handout for Topic 9 on our web site, slides 82-85, especially the trace plot on slide 84, which shows a low and a high set of estimates corresponding to two different solutions. The label switching topic needs to be well understood before doing Bayes mixture analyses. 
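To illustrate the two "bands" of estimates described above: one common (if imperfect) remedy for label switching is to impose an identifiability constraint by reordering the component labels at each iteration, e.g. so that the class means are always increasing. The sketch below (not Mplus functionality; array layout and function name are assumptions for illustration) shows how sorting each iteration's draws removes a mid-chain label swap.

```python
import numpy as np

def relabel_by_mean(draws):
    """Relabel mixture-component draws with an order constraint.

    draws: array of shape (n_iterations, n_classes) holding, e.g., the
    class means at each MCMC iteration. Sorting each row imposes the
    constraint mu_1 < mu_2 < ..., which removes the low/high "bands"
    that label switching produces in a trace plot.
    """
    return np.sort(np.asarray(draws, dtype=float), axis=1)

# Toy chain: two classes whose labels swap halfway through
chain = np.array([[1.0, 5.0]] * 50 + [[5.0, 1.0]] * 50)
fixed = relabel_by_mean(chain)
print(fixed[:, 0].std())  # 0.0 -- after relabeling, class 1 is constant
```

Order constraints can distort the posterior when components overlap, which is one reason the sources cited above stress understanding the issue rather than applying a mechanical fix.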

Carolyn Hou posted on Monday, February 28, 2011  6:18 pm



I have several questions regarding Bayes mixture modeling. 1. When "Data imputation: Plausible=" commands are used, is the algorithm still MCMC by default? 2. What is the difference conceptually between parameter estimates with and without using this command? 3. Which set of parameter estimates should be used for more accuracy, those with this command, or those without it? Thank you very much. 


1. Yes. 2-3. There is no difference. Plausible values, like factor scores, are computed after model estimation. 

Tim Stump posted on Friday, September 28, 2012  7:56 am



I am using estimator=bayes to run LCA of 11 binary items. My question is about how to best know when the process has converged. I set fbiterations to a large value, say 25,000. Some of my models conclude with PSR<1.1, but another one I ran had PSR hovering between 8 and 10 and never getting close to 1. Do you have suggestions on what you might do in a case like this? If the model is misspecified, would this be a characteristic you see in the model fitting process? I'm curious about a general strategy for model fitting/checking for convergence using bayes for lca. 

Tim Stump posted on Friday, September 28, 2012  8:19 am



Just a follow-up to my previous post... let's say you have a PSR that hovers between 2 and 4 for a large number of fbiterations, not as extreme as a PSR between 8 and 10. What might be a next step to obtain convergence? 


To do Bayesian estimation of mixture models takes some studying because it is not a straightforward analysis. A key issue is label switching. This is discussed in the 6/1/11 Topic 9 course video and handout on our web site. The topic of using PSR also needs to be studied; issues like premature PSR convergence and non-identification are discussed in the new Utrecht video and handout on our web site. PSR values of 2-4 are not really any different from 8-10 in that both indicate problems in the "mixing" (convergence), either due to label switching or non-identification. 
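For readers wanting to see what the PSR measures: it is the Gelman-Rubin potential scale reduction, which compares between-chain and within-chain variance for a parameter. A minimal sketch (the function name and simulated chains are illustrative assumptions, not Mplus output) shows well-mixed chains giving PSR near 1, while chains stuck in different modes, as happens under label switching, give PSR well above 1.

```python
import numpy as np

def psr(chains):
    """Potential scale reduction (Gelman-Rubin R-hat) for one parameter.

    chains: array of shape (m_chains, n_iterations). Values near 1
    suggest good mixing; values of 2-10 signal the kinds of problems
    discussed above (label switching, non-identification).
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * W + B / n         # pooled variance estimate
    return float(np.sqrt(var_hat / W))

rng = np.random.default_rng(1)
mixed = rng.normal(size=(4, 2_000))                  # chains agree
stuck = mixed + np.array([[0.0], [0.0], [5.0], [5.0]])  # two chains in another mode
print(psr(mixed))   # close to 1
print(psr(stuck))   # well above 1
```

This is why a PSR of 2-4 is no better than 8-10: any value far from 1 means the chains disagree about where the posterior mass is.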


Hello, I am trying to model a multigroup latent Markov chain using known classes. While I was able to run the model using MLR with random starts, I would like to know if this type of analysis is possible using Bayesian estimation. For the moment I keep getting this error: "Analysis with more than one categorical latent variables is not allowed with ESTIMATOR=BAYES." Thank you, Alex 


This is not allowed yet with BAYES. 

Alden Gross posted on Tuesday, August 27, 2013  2:53 pm



Hello, In growth mixture models and latent class analysis, neither latent class regression nor post-estimation tests of means using the auxiliary option are available with ESTIMATOR=BAYES. How, then, can I appropriately examine predictors of latent class membership when my estimator is BAYES? I do not want the variables to affect latent class membership. 


For LCA, you can do a 3-step analysis, but for GMM that is trickier; see Web Note 15. 


I mean doing the 3-step "manually" as described in the web note. 

Alden Gross posted on Wednesday, August 28, 2013  2:56 pm



Thank you! I replicated the web note and adapted it for my purposes. 

Ted Fong posted on Thursday, March 27, 2014  8:10 am



I have conducted a Bayes mixture model with one categorical latent variable and 2 continuous latent variables. I would like to ask: 1) Is label switching still an issue of concern in Bayes mixture modeling in Mplus 7? 2) It seems Tech11 and Tech13 are not available with the Bayes estimator. The output file did not show any results for Tech12 and Tech14. Are these two also not available with Bayes? 3) With Bayes, all of the auxiliary options (R3STEP/DU3STEP/E/DCAT/DCON) for comparisons are not available, right? 4) There are no BIC/DIC shown in my Bayes mixture model. I can only find the posterior predictive p value. In this case, do you have any suggestions on what other tools to use to determine the model fit of competing models? Thank you very much. 


1. Yes. 2. These are not available with Bayes. 3. Correct. 4. Nothing more is available at this time. 

Ted Fong posted on Thursday, March 27, 2014  8:11 pm



Dear Dr. Muthén, Thanks so much for the prompt clarification on mixture modeling using Bayesian estimation! Ted 

Dena Pastor posted on Tuesday, January 24, 2017  10:11 am



Is DIC still not available for Bayesian LCAs in Mplus? 


Still not available. DIC for the general Mplus mixture model is difficult or impossible to compute, and we have not made progress on the special cases where it is feasible, such as LCA. 
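A standard way to see the difficulty (this formulation is from Spiegelhalter et al.'s DIC, not from Mplus documentation): the criterion is defined as

```latex
\mathrm{DIC} = \bar{D} + p_D, \qquad p_D = \bar{D} - D(\bar{\theta}),
```

where \(\bar{D}\) is the posterior mean of the deviance and \(D(\bar{\theta})\) is the deviance evaluated at the posterior mean \(\bar{\theta}\) of the parameters. In a mixture model with label switching, the posterior mean \(\bar{\theta}\) averages over permuted labelings of the classes, so \(D(\bar{\theta})\), and hence the effective number of parameters \(p_D\), is not well defined.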
