
Jan Ivanouw posted on Saturday, November 06, 2010  10:28 am



When trying to perform EFA with Bayes estimation, the output file does not contain the usual information. There is the usual logo in the top left corner (Muthén & Muthén, etc.), but apart from that, the output file is a direct replicate of the input file. There is no information about computation time or anything, and no error messages either. Also, the output file does not appear on the screen as usual but is only saved to disk. What is wrong? 


Bayes is not yet available with EFA. See Version History for version 6. 

Jan Ivanouw posted on Saturday, November 06, 2010  11:29 am



Oh, I have misunderstood your presentation from Johns Hopkins, August 18, 2010, where you talked about an EFA example, but it must have been a non-Bayes version. Looking forward to a Bayes implementation! 


Yes, the EFA was ML. Yes, Bayes EFA will be very useful, also taking care of Heywood cases and small samples. 


Greetings, Drs. Muthén. I understand that EFA cannot be executed using Bayes. My question is: can a more constrained model, such as an EFA-within-CFA, be run using the Bayes estimator? My model did converge, but I am unsure whether there are negative implications, or whether it is even practical to do so with Bayes. My model is below. Also, I would like to take the opportunity to thank the Mplus team for devoting so much time and effort to this great tool; I am having such a blast learning new things about Mplus every day.

ANALYSIS:
ESTIMATOR = BAYES;
PROCESS = 2;
FBITER = 20000;
STVAL = ml;
MODEL:
F1 BY M1-M20*.5 M10@0 M17@0;
F2 BY M1-M20*.5 M17@0 M18@0;
F3 BY M1-M20*.5 M10@0 M18@0;
F1-F3@1; 


Bayes can do EFA within CFA. I don't see any negative consequences; a positive outcome is no negative residual variances. There may in some cases be convergence difficulties given such a relaxed model. The real advantage of Bayes in factor analysis, I think, is the possibility discussed in: http://www.statmodel.com/download/BSEMv4.pdf 

Jan Zirk posted on Thursday, November 01, 2012  12:07 pm



Dear Bengt, Linda, or Tihomir,

My questions concern a comparison of the results of a Bayesian and a WLSMV EFA 1-8. 30 categorical items (7 categories) were used (big sample: n=83548, so the Bayesian EFA took 125h while the WLSMV took less than 1h). The theory of the tested instrument suggests 5 factors. According to the eigenvalues, which are the same under both estimation methods, the 7-factor solution is the first with eigenvalue > 1. WLSMV computed all 8 solutions and showed goodness-of-fit indices for all of them. The Bayesian EFA provided output for up to 5 factors, and there was no convergence for 6, 7, and 8. My first question is:

1) Can the lack of convergence for 6-8 factors be used as evidence for preferring the 5-factor solution?

In the next step, a 1-factor Bayesian CFA for all 30 items was run for item scaling (thus the mean plausible values of the 30 categorical items were extracted). These continuous measures were used in the Bayesian EFA 1-8 (which was much faster this time). This time there was convergence for the 6-factor solution, which was better than the 5-factor one, and there was no convergence for 7 and 8. So my second question is:

2) Would plausible values of the latent response variables of the categorical items from a 1-factor CFA be the same as the latent response plausible values of these categorical measures from an analysis with different structural properties (e.g., a 5-factor EFA)?

Best wishes, Jan 


I would not choose the number of factors based on eigenvalues > 1. If you want to use eigenvalues, I think it is better to look for a "break" (an "elbow"). Bayes EFA is slower than WLSMV EFA when the sample size gets larger and also has computational difficulties when factors are weakly measured, as happens when overfactoring.

1) Maybe, but that is weak evidence and doesn't say whether 5 factors provides a good model.

2) If you want to create plausible values behind each of the 30 categorical variables, try using the "H1" model u1-u30 WITH u1-u30; Bayes then gives you plausible values for the 30 continuous u* variables. 
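The eigenvalue > 1 rule versus the "elbow" can be illustrated numerically. The sketch below (a hypothetical simulation, not the poster's data) builds a correlation matrix from data with 5 factors and locates the elbow as the point where the drop between successive eigenvalues is largest:

```python
# Sketch: eigenvalue > 1 rule vs. looking for an "elbow" in the scree.
# All data here are simulated and hypothetical, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, n_items, n_factors = 2000, 30, 5

# 30 items, 6 loading 0.7 on each of 5 orthogonal factors (hypothetical).
loadings = np.zeros((n_items, n_factors))
for j in range(n_items):
    loadings[j, j % n_factors] = 0.7
factors = rng.standard_normal((n, n_factors))
errors = 0.5 * rng.standard_normal((n, n_items))
data = factors @ loadings.T + errors

corr = np.corrcoef(data, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending

# Rule of thumb: count eigenvalues above 1.
over_one = int(np.sum(eigvals > 1))
# Elbow: the largest drop between successive eigenvalues.
drops = -np.diff(eigvals)
elbow = int(np.argmax(drops)) + 1

print("eigenvalues > 1:", over_one)
print("elbow after eigenvalue #", elbow)
```

With clean, strongly measured factors the two criteria agree; the elbow criterion is more robust when weak extra eigenvalues hover just above 1, which is the overfactoring situation described above.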

Jan Zirk posted on Friday, November 02, 2012  2:16 am



Thanks so much! 

Jan Zirk posted on Friday, November 02, 2012  2:18 am



I noticed the parallel analysis facility, so I will also use this approach with MLR once I have the plausible values extracted. Best wishes, Jan 


Note that parallel analysis is for continuous variables only, so you have to accept this approximation. We noticed that it didn't work well for tetrachoric and polychoric correlation matrices. 
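For readers unfamiliar with the procedure, Horn's parallel analysis retains a factor only while the observed eigenvalue exceeds the corresponding eigenvalue from random data of the same shape. A minimal sketch for continuous variables (function name and simulated data are hypothetical):

```python
# Sketch of Horn's parallel analysis for continuous variables.
# Names and data are hypothetical; this is not Mplus's implementation.
import numpy as np

def parallel_analysis(data, n_reps=100, quantile=0.95, seed=1):
    """Suggested number of factors: leading observed eigenvalues that
    exceed the same-rank eigenvalue from random normal data."""
    n, p = data.shape
    rng = np.random.default_rng(seed)
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_reps, p))
    for r in range(n_reps):
        random = rng.standard_normal((n, p))
        sims[r] = np.sort(
            np.linalg.eigvalsh(np.corrcoef(random, rowvar=False)))[::-1]
    threshold = np.quantile(sims, quantile, axis=0)
    keep = 0
    for o, t in zip(obs, threshold):
        if o > t:
            keep += 1
        else:
            break
    return keep

# Hypothetical usage: 12 continuous items from 3 factors, 1000 cases.
rng = np.random.default_rng(2)
n = 1000
F = rng.standard_normal((n, 3))
X = np.hstack([0.7 * F[:, [k]] + 0.5 * rng.standard_normal((n, 4))
               for k in range(3)])
suggested = parallel_analysis(X)
print("suggested number of factors:", suggested)
```

Running this on plausible values (continuous by construction) sidesteps the tetrachoric/polychoric problem mentioned above, at the cost of the approximation already noted.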

Jan Zirk posted on Friday, November 02, 2012  9:48 pm



That's right! I meant running the parallel analysis on the plausible-value measures from the Bayesian approach. Best wishes, Jan 


Did your run with u1-u30 WITH u1-u30; work out? 

Jan Zirk posted on Friday, November 02, 2012  10:23 pm



Still running; I think I will know in the morning (in about 10h). I had to run different analyses first, which took some time. Will let you know as soon as I have the results. B/w Jan 

Jan Zirk posted on Friday, November 02, 2012  10:48 pm



To my surprise, the model has just been found to be identified (after 800 iterations!), and now the imputations are being generated (n=10). This holds promise for much faster further analyses. All the best, Jan 

Jan Zirk posted on Friday, November 02, 2012  11:30 pm



I would like to ask you about the Bayes factor. As far as I understand, the first piece of information on the Bayes factor provided in the output indicates a preference for the more complex model if the value is bigger than 3 (by convention). Where can I find information on how the log of the Bayes factor is computed and what it is useful for? Is there an article describing how it is computed in Mplus? My second question: could you provide a reference to an article showing how ordinal variables are treated under Bayesian estimation, in comparison to the WLSMV probit and ML logistic approaches? I am trying to understand the mechanism underlying the extraction of plausible values from categorical variables without the necessity of regressing them on a latent factor (as in the H1 model that you mentioned). With best wishes, Jan 


The log of the Bayes factor is just that: the log of the Bayes factor. It does not provide any new information; it is given for convenience. You can see that way how the Bayes factor is computed. The Bayes EFA methodology is described in http://statmodel.com/download/HighDimension9.pdf. The EFA model uses a standardized metric. The estimates and factor scores for the 3 estimators are in the same standardized metric and are directly comparable. 
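The convenience of the log scale is easy to see with a worked number (the BF value below is hypothetical): a Bayes factor too large to print exactly remains a small, readable number after taking the log, and the conventional BF > 3 cutoff maps to log BF > log(3):

```python
# Hypothetical worked example: the log Bayes factor is simply log(BF).
import math

bayes_factor = 2.5e8                 # hypothetical BF for two models
log_bf = math.log(bayes_factor)      # natural log
print(f"BF = {bayes_factor:.3g}, log BF = {log_bf:.2f}")

# The conventional "BF > 3" cutoff becomes log BF > log(3) ~ 1.10.
prefers_complex = log_bf > math.log(3)
print("prefers more complex model:", prefers_complex)
```

This also explains outputs that print only ">1000000" for the BF itself: on the log scale the same comparison stays numerically well behaved.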

Jan Zirk posted on Tuesday, November 06, 2012  4:52 pm



Oh, thank you for the article; that is really useful. Now I understand that the default for ordered categorical variables is probit. I realize that my first question on the 'log of the Bayes factor' was really confusing. Of course, 'log' just means the logarithm of the BF value; the BF is so high in all my models that the output shows only an approximation (>1000000). Everything is clear now. 
