

Individual estimates: simultaneous m... 

Message/Author 


I am trying to estimate the following simultaneous system:

R1 = a1 + b1*R2 + c1*I1 + c2*I2 + error
R2 = a2 + b2*R1 + c3*I3 + error

R1 and R2 are interdependent, I1, I2, and I3 are exogenous, and all are manifest variables. I want to allow the vector (a1, a2, b1, b2, c1, c2, c3) to vary across classes. Say I use 300 data points to estimate this and obtain a 2-class solution. I know Mplus prints the posterior class probabilities for each of the 300 points. Can you tell me how to obtain individual-level estimates for all the 300 points? Can you also tell me why this is so mathematically, i.e., by referring me to the answer as derived from the likelihood function being estimated? Thanks!!!! Hari


Posterior class probabilities are given for each of the 300 data points (I assume n=300 individuals) for each of the 2 classes. The posterior probability is computed by equation (161) of Appendix 8 (from Version 2) which is on the web site. Here, the RHS numerator is the likelihood contribution for individual i. 
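The Bayes' rule calculation the reply points to can be sketched in a few lines. This is a generic illustration only; the class proportions and likelihood values below are made up, not Mplus output:

```python
def posterior_class_probs(class_probs, likelihoods):
    """Posterior class probabilities for one individual:
    P(c | y_i) = pi_c * f(y_i | c) / sum_k pi_k * f(y_i | k).
    The numerator is the individual's likelihood contribution
    under class c, weighted by the estimated class proportion."""
    joint = [pi * f for pi, f in zip(class_probs, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Hypothetical 2-class example: prior class proportions 0.6 / 0.4,
# class-conditional likelihood values for one data point
post = posterior_class_probs([0.6, 0.4], [0.02, 0.10])
```

The returned probabilities sum to 1 by construction, which is what lets Mplus report them as posterior class membership probabilities per data point.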


Thank you for that. However, I was wondering how to get individual-level parameter estimates from the posterior probabilities of my model.


I don't know what you mean by "individual-level parameter estimates"; can you please clarify? This is not a multilevel model where the parameters vary across individuals.


My goal is to obtain parameter estimates for each of the subjects in my sample using the posterior probabilities of the latent class model. For example, Mantrala et al. (Journal of Marketing Research, November 2006) estimate a latent class multinomial logit model for a few hundred stores. They then obtain store-specific parameter estimates using this procedure:

"Estimation of Store-Specific Demand Parameters. We use the estimated parameters of the heterogeneous MNL model ({K, α1k, α2k, α3k, βk, πk}; k = 1, …, K) in an empirical Bayes procedure to estimate store-specific demand parameters for store s, as we show subsequently (e.g., Kamakura and Russell 1989; Rossi and Allenby 1993).

Step 1. We compute the store's conditional likelihood function Ls(δk), given that the store belongs to support k. We repeat this calculation for each segment to obtain Ls(δ1), …, Ls(δK).

Step 2. We use Bayes' rule to obtain the store's posterior support membership probability for support k. Again, the priors are equal to the probability masses, πsk, estimated in the maximum likelihood routine. We repeat this calculation for each support to obtain πs1 post, …, πsK post (appropriately renormalizing them to sum to 1). The store-specific demand parameters are now ({α1k, α2k, α3k, βk, πsk post}; k = 1, …, K)."
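The two steps quoted above amount to a posterior-weighted combination of the class-specific estimates. A minimal sketch of that procedure for one unit (all parameter values, priors, and likelihoods below are hypothetical, not taken from the article):

```python
def empirical_bayes(class_params, priors, cond_likelihoods):
    """Steps 1-2 of the quoted procedure for one unit (store):
    given its conditional likelihood under each support/class and
    the estimated class masses as priors, form posterior membership
    probabilities, then a posterior-weighted parameter vector as a
    single unit-specific summary."""
    joint = [p * l for p, l in zip(priors, cond_likelihoods)]
    total = sum(joint)
    post = [j / total for j in joint]  # renormalized to sum to 1
    n_par = len(class_params[0])
    estimate = [sum(post[k] * class_params[k][j]
                    for k in range(len(post)))
                for j in range(n_par)]
    return post, estimate

# Two classes, each with a hypothetical (slope, intercept) pair
post, est = empirical_bayes([[1.0, 0.5], [3.0, -0.5]],
                            priors=[0.5, 0.5],
                            cond_likelihoods=[0.2, 0.8])
```

Note that the article's "store-specific parameters" retain the full set of class parameters together with the store's posterior weights; the weighted average shown here is one common way to collapse that set into a single vector per unit.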


Can you please send me a copy of Mantrala's article so I can see what you mean?


I read the Mantrala et al. (2007) article in the Journal of Marketing Research that you referred to. They consider two-level data with shoppers making choices within stores. They discuss a two-level random effect multinomial logistic regression model where the coefficients vary across stores. They point out that Kamakura & Russell (1989) use a semiparametric approach to this, where it sounds like they replace normality assumptions on the random effects with a nonparametric approach via mixtures. Both approaches can be done in Mplus; the mixture approach is done by using a level-2 latent class variable in line with the Version 4.2 addendum examples.


Thanks a lot! Can I do it for a model where I have 300 data points on aggregate cross-sectional data, i.e., the model I highlight in message 1 in this thread?

R1 = a1 + b1*R2 + c1*I1 + c2*I2 + error
R2 = a2 + b2*R1 + c3*I3 + error

R1 and R2 are interdependent, I1, I2, and I3 are exogenous, and all are manifest variables. I want to allow the vector (a1, a2, b1, b2, c1, c2, c3) to vary across classes.


Well, I don't see that your situation has the multilevel multinomial feature which was the key issue in Mantrala et al. But if you are asking whether you can do a mixture version of your single-level simultaneous equation system for continuous outcomes, the answer is yes. And yes, you can get posterior probabilities for each data point in those mixture classes.
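Putting the thread together for the two-equation model: once the class-specific estimates of (a1, a2, b1, b2, c1, c2, c3) and each point's posterior probabilities are in hand, an individual-level estimate in the empirical Bayes sense is the posterior-weighted mix of the class vectors. A sketch with invented numbers (the theta values and posterior probabilities are placeholders, not real estimates):

```python
# Hypothetical class-specific estimates of (a1, a2, b1, b2, c1, c2, c3)
theta = {
    1: [0.5, 1.2, 0.3, -0.2, 0.8, 0.1, 0.4],
    2: [1.5, 0.2, -0.1, 0.6, 0.2, 0.9, -0.3],
}

def individual_estimate(p1, p2, theta):
    """Posterior-weighted parameter vector for one data point;
    p1, p2 are its posterior probabilities for classes 1 and 2."""
    return [p1 * t1 + p2 * t2 for t1, t2 in zip(theta[1], theta[2])]

# A point with posterior probabilities (0.9, 0.1) stays close to
# the class-1 parameter vector
est = individual_estimate(0.9, 0.1, theta)
```

Points classified with high certainty get estimates close to one class's vector; points near 50/50 get estimates between the two.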


