Bayesian mixture model
Mplus Discussion > Latent Variable Mixture Modeling >
 Siddhi Pittayachawan posted on Wednesday, February 02, 2011 - 4:22 pm
I am analysing data using a Bayesian mixture model. I am checking the autocorrelation of the generated chains and wondering whether I have to ensure that all chains meet the criterion of autocorrelation less than .1, or whether one chain is enough.
 Bengt O. Muthen posted on Wednesday, February 02, 2011 - 5:06 pm
If it is a long chain, one should be enough for this check.

And, look out for label switching by investigating the trace line.
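For readers who want to replicate this autocorrelation check outside Mplus, here is a minimal Python sketch. The chain below is simulated (an AR(1)-style series); in practice you would read in the exported chain values from the Mplus plot data:

```python
import numpy as np

def autocorrelation(chain, lag):
    """Sample autocorrelation of a 1-D MCMC chain at the given lag."""
    chain = np.asarray(chain, dtype=float)
    centered = chain - chain.mean()
    # Normalize by the lag-0 autocovariance so the result lies in [-1, 1].
    return float(np.dot(centered[:-lag], centered[lag:]) / np.dot(centered, centered))

# Simulated chain with moderate serial dependence (illustrative only).
rng = np.random.default_rng(0)
chain = np.empty(20000)
chain[0] = 0.0
for t in range(1, chain.size):
    chain[t] = 0.5 * chain[t - 1] + rng.normal()

print(autocorrelation(chain, lag=1))   # sizable at lag 1
print(autocorrelation(chain, lag=30))  # well below the .1 criterion
```

With a long, well-mixing chain the autocorrelation should drop below .1 by a modest lag, which is the criterion discussed above.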
 Siddhi Pittayachawan posted on Wednesday, February 02, 2011 - 5:40 pm
How many iterations are considered to be long enough?

And, does "label switching" mean the red vertical line in the trace plot?
 Bengt O. Muthen posted on Thursday, February 03, 2011 - 11:52 am
Hard to say; perhaps twice as long as it took to get the PSR down to convergence.

For label switching, see Section 8.2 of our technical report on our web site:

Asparouhov, T. & Muthén, B. (2010). Bayesian analysis using Mplus: Technical implementation. Technical Report. Version 3.

as well as the handout for Topic 9 on our web site, slides 82-85, especially the trace plot on slide 84, which shows a low and a high set of estimates corresponding to two different solutions. The label switching topic needs to be well understood before doing Bayes mixture analyses.
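The trace-plot symptom described above (a chain jumping between a low and a high set of estimates) can be illustrated with a small relabeling sketch in Python. The simulated draws and the ordering constraint (class 1 mean below class 2 mean) are illustrative assumptions, not Mplus defaults:

```python
import numpy as np

def relabel_two_class(mu1, mu2):
    """Enforce an identifiability constraint mu1 < mu2 at every iteration,
    swapping the two classes' draws wherever the chain has label-switched."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    swapped = mu1 > mu2
    lo = np.where(swapped, mu2, mu1)
    hi = np.where(swapped, mu1, mu2)
    return lo, hi, int(swapped.sum())

# Simulated chain: true class means 0 and 3, labels switch halfway through.
rng = np.random.default_rng(1)
n = 1000
mu1 = np.concatenate([rng.normal(0.0, 0.1, n // 2), rng.normal(3.0, 0.1, n // 2)])
mu2 = np.concatenate([rng.normal(3.0, 0.1, n // 2), rng.normal(0.0, 0.1, n // 2)])

lo, hi, n_swaps = relabel_two_class(mu1, mu2)
print(n_swaps, lo.mean(), hi.mean())
```

Before relabeling, each raw chain would show the two-level trace pattern; after imposing the ordering, each relabeled series stays near one solution.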
 Carolyn Hou posted on Monday, February 28, 2011 - 6:18 pm
I have several questions regarding Bayes mixture modeling.
1. When "Data imputation: Plausible=" commands are used, is the algorithm still MCMC by default?
2. What is the difference conceptually between parameter estimates with and without using this command?
3. Which set of parameter estimates should be used for more accuracy, those with this command, or those without it?

Thank you very much.
 Linda K. Muthen posted on Tuesday, March 01, 2011 - 9:27 am
1. Yes.
2-3. There is no difference. Plausible values, like factor scores, are computed after model estimation.
 Tim Stump posted on Friday, September 28, 2012 - 7:56 am
I am using ESTIMATOR=BAYES to run an LCA of 11 binary items. My question is about how best to know when the process has converged. I set FBITERATIONS to a large value, say 25,000. Some of my models conclude with PSR < 1.1, but another one I ran had the PSR hovering between 8 and 10 and never getting close to 1. Do you have suggestions on what to do in a case like this? If the model is mis-specified, would this be a characteristic you see in the model-fitting process? I'm curious about a general strategy for model fitting and checking convergence when using Bayes for LCA.
 Tim Stump posted on Friday, September 28, 2012 - 8:19 am
Just a follow-up to my previous post: let's say you have a PSR that hypothetically hovers between 2 and 4 for a large number of FBITERATIONS, not as extreme as a PSR between 8 and 10. What might be a next step to figure out how to obtain convergence?
 Bengt O. Muthen posted on Friday, September 28, 2012 - 8:50 am
To do Bayesian estimation of mixture models takes some studying because it is not a straightforward analysis. A key issue is label switching. This is discussed in the 6/1/11 Topic 9 course video and handout on our web site. The topic of using PSR also needs to be studied; issues like premature PSR convergence and non-identification are discussed in the new Utrecht video and handout on our web site.

PSR values of 2-4 are not really any different from 8-10 in that both settings indicate problems in the "mixing" (convergence), either due to label switching or non-identification.
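As a reference point (not Mplus output): the PSR is the Gelman-Rubin potential scale reduction, which compares between-chain and within-chain variance. A minimal Python version for one parameter across several chains might look like this, with simulated draws standing in for real chains:

```python
import numpy as np

def psr(chains):
    """Potential scale reduction (Gelman-Rubin) for one parameter,
    given a (num_chains, num_iterations) array of posterior draws."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    # Between-chain variance B and mean within-chain variance W.
    b = n * chain_means.var(ddof=1)
    w = chains.var(axis=1, ddof=1).mean()
    var_hat = (n - 1) / n * w + b / n
    return float(np.sqrt(var_hat / w))

rng = np.random.default_rng(2)
mixed = rng.normal(0.0, 1.0, size=(4, 2000))            # chains agree
stuck = mixed + np.array([[0.0], [0.0], [5.0], [5.0]])  # two chains in another mode
print(psr(mixed))  # close to 1
print(psr(stuck))  # far above 1.1
```

Chains stuck in different modes (as with label switching or non-identification) keep the between-chain variance large, so the PSR stays well above 1 no matter how long the run, which matches the behavior described in the posts above.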
 Alexandru Cernat posted on Tuesday, June 04, 2013 - 12:23 am

I am trying to model a multigroup latent Markov chain using known classes. While I was able to run the model using MLR with random starts, I would like to know if this type of analysis is possible using Bayesian estimation. For the moment I keep getting this error: "Analysis with more than one categorical latent variables is not allowed with ESTIMATOR=BAYES."

Thank you,

 Linda K. Muthen posted on Tuesday, June 04, 2013 - 7:26 am
This is not allowed yet with BAYES.
 Alden Gross posted on Tuesday, August 27, 2013 - 2:53 pm

In growth mixture models and latent class analysis, neither latent class regression nor post-estimation tests of means using the auxiliary option are available with ESTIMATOR=BAYES.

How, then, can I appropriately examine predictors of latent class membership when my estimator is BAYES? I do not want the variables to affect latent class membership.
 Bengt O. Muthen posted on Tuesday, August 27, 2013 - 3:03 pm
For LCA, you can do 3-step analysis, but for GMM that is trickier - see Web Note 15.
 Bengt O. Muthen posted on Tuesday, August 27, 2013 - 3:10 pm
I mean doing 3-step "manually" as described in the web note.
 Alden Gross posted on Wednesday, August 28, 2013 - 2:56 pm
Thank you! I replicated the webnote and adapted it for my purposes.
 Ted Fong posted on Thursday, March 27, 2014 - 8:10 am
I have conducted a Bayes mixture model with one categorical latent variable and 2 continuous latent variables. I would like to ask:

1) Is 'Label switching' still an issue of concern in Bayes mixture modeling in Mplus 7?

2) It seems Tech11 and Tech13 are not available with the Bayes estimator. The output file did not show any results for Tech12 and Tech14. Are these two also not available with Bayes?

3) With Bayes, all of the auxiliary options (R3STEP/DU3STEP/E/DCAT/DCON) for comparing classes are not available, right?

4) There are no BIC/DIC shown in my Bayes mixture model. I can only find the posterior predictive p value. In this case, do you have any suggestions on what other tools to determine the model fit of competing models?

Thank you very much.
 Linda K. Muthen posted on Thursday, March 27, 2014 - 1:51 pm
1. Yes.
2. These are not available with Bayes.
3. Correct.
4. Nothing more is available at this time.
 Ted Fong posted on Thursday, March 27, 2014 - 8:11 pm
Dear Dr. Muthén,

Thanks so much for the prompt clarification on mixture modeling using Bayesian estimation!

 Dena Pastor posted on Tuesday, January 24, 2017 - 10:11 am
Is DIC still not available for Bayesian LCAs in Mplus?
 Tihomir Asparouhov posted on Tuesday, January 24, 2017 - 6:56 pm
Still not available. DIC for the general Mplus mixture model is difficult or impossible to compute and we have not made progress on the special cases where it is feasible such as LCA.
 Nicole Tuitt posted on Saturday, October 06, 2018 - 1:40 pm

My GMM exported means plot data doesn't conceptually make sense to me. Here is my exported plot data:

Time   Class 1   Class 2
0      3.58814   3.24167
1      3.53889   2.98282
2      3.55322   2.78756
3      3.57139   2.59613

According to my model output, my intercept for Class 1 is 3.336, and for class 2, 2.989. Shouldn't the means of my plot data at t0 be the same as the class 1 and class 2 intercepts?


 Bengt O. Muthen posted on Saturday, October 06, 2018 - 3:32 pm
Send your full output to Support along with the exported plot data file and your license number.
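For readers puzzling over the same mismatch: in a linear growth model the model-implied mean at time t is the intercept mean plus the slope mean times the time score, so the plotted mean at t0 equals the intercept mean only when the first time score is zero (and when the plot shows model-implied rather than sample means). A hedged arithmetic sketch; the slope value below is made up for illustration:

```python
# Model-implied means for a linear growth model:
# E[y_t] = intercept_mean + slope_mean * time_score_t.
# The slope value is illustrative, not taken from any output.

def implied_means(intercept, slope, time_scores):
    return [intercept + slope * t for t in time_scores]

print(implied_means(3.336, -0.05, [0, 1, 2, 3]))
# At time score 0 the implied mean equals the intercept mean; if the
# plotted t0 value differs, the plot may be showing estimated (sample)
# means, or the first time score may not be zero.
```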
 Nicholas J Parr posted on Monday, April 08, 2019 - 1:11 pm
Hello Drs. Muthen -

Earlier in this thread (March 27, 2014) I see that it is not possible to use the (automatic?) auxiliary variable options with the Bayes estimator. From my attempts it seems this is still the case in Version 8, but I'm wondering if I can implement the manual 3-step approaches mentioned in the web notes (specifically for the BCH and DCAT methods)?

I have attempted to do this, and get the overall model to converge just fine, but whenever I specify SAVE = bchweights, I get the following error:


Elsewhere I have seen discussion of this error and it was said that integration is needed for data with missingness - so I tried with complete data and got the same error.

Thanks for your help!
 Tihomir Asparouhov posted on Monday, April 08, 2019 - 7:10 pm
First, just to be clear: we do not have a manual DCAT approach. We have the manual 3-step approach, which can be used with categorical distal outcomes; that approach is described in Web Note 15, Section 3 (page 332).

I do think that the manual BCH and 3-step can in principle be conducted with the Bayes estimator; however, we have not done it yet, and a number of the steps would require a bit more work than with the ML estimator. The manual 3-step would be the easier of the two for you. Manual Bayes BCH would not be easy at all, and Mplus-Bayes will not accept any weights (even BCH weights).
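For orientation (illustrative only, not Mplus code): in the manual 3-step approach, the measurement part is fixed in the final step using logits derived from the average posterior class probabilities of the modally assigned groups, with the last class as the reference. A sketch of that computation with made-up probabilities:

```python
import numpy as np

# Illustrative average posterior class probabilities: row c holds the
# averaged posteriors for cases modally assigned to class c.
# These numbers are made up, not taken from any Mplus output.
avg_post = np.array([
    [0.90, 0.10],   # assigned to class 1
    [0.15, 0.85],   # assigned to class 2
])

# Fixed logits log(p_ck / p_cK), last class as reference.
logits = np.log(avg_post / avg_post[:, -1:])
print(logits)
```

In the ML version of the approach these logits are entered as fixed measurement parameters in the final-step model; whether the same values carry over cleanly under Bayes is exactly the extra work referred to above.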