Message/Author 

Roger Brown posted on Monday, February 16, 2015  9:44 am



I have been playing around with the new univariate entropy measures with an LCA using a variety of binary items. The sample size is fairly large (>1500 cases). Once I go beyond 17 binary items, univariate entropy fails to estimate (999.000) and I receive the following:

THE MODEL ESTIMATION TERMINATED NORMALLY

THE CHI-SQUARE TEST IS NOT COMPUTED BECAUSE THE FREQUENCY TABLE FOR THE LATENT CLASS INDICATOR MODEL PART IS TOO LARGE.

I am assuming there is a limitation on the size of the frequency table? Is there a way to override this? Thanks. Roger 
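The thread doesn't state the exact cell limit Mplus applies, but the combinatorics behind the message are easy to sketch: with r binary indicators, the frequency table of response patterns has 2**r cells, so it doubles with every item added. The helper name below is hypothetical, just for illustration.

```python
# Back-of-the-envelope check (not Mplus source; just the combinatorics):
# with r binary indicators, the response-pattern frequency table has
# 2**r cells, so it doubles with every item added.
def table_cells(r, categories=2):
    """Number of cells in the response-pattern frequency table."""
    return categories ** r

for r in (10, 17, 25):
    print(r, "items ->", table_cells(r), "cells")
```

At 17 binary items the table already has 131,072 cells, which is why chi-square tests against the observed frequency table become impractical.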


Yes, you can add "output: tech10". 


Hi, I am working with a model similar to example 7.20 in the user guide (p. 175), including both a continuous latent variable (f1) and a categorical latent variable (c), plus a set of indicators (y1-y3). Is it possible to compute the univariate entropy for the continuous latent variable (f1 in example 7.20)? I have no problems obtaining the univariate entropy values for my indicators y1-y3. Thanks in advance, David 


No, that is not available. 


Thanks for the quick reply! 


Another question on univariate entropy: My model contains a single categorical latent variable c and a number of indicators y1-y6. Some of these indicators (y1-y4) correspond to single items on a questionnaire and together make up a factor (f), whilst other indicators (y5-y6) correspond to the domains of a scale and are based on the mean of a number of items. Is it possible to directly compare the univariate entropy of indicator y1 with that of indicator y6? In terms of example 7.20, I would be trying to compare the univariate entropy of indicator y3 with a hypothetical indicator y7 which is not part of any of the factors. Thanks in advance, 


I don't see any problem with comparing those. 


Hi, Another question based on example 7.20 in the user guide. I understand that I cannot directly compute the univariate entropy for the continuous latent variable f1. Is there another method to compute that value, e.g. summing the univariate entropies for the indicators y1-y3? Thanks, David 


That sounds reasonable; use the average entropy of the indicators. 


Hello, Is it possible to manually compute univariate entropies based on Mplus output? Similar to how model entropy can be calculated based on the cprobs output. Thanks in advance, David 


You have to set up the "univariate" models, each with one dependent variable and all parameters fixed to the values from the multivariate model. At that point the model entropy of the univariate model is the univariate entropy, and you can compute it manually using the cprobs from the univariate model. 


Thanks! 


Hi, If you were to specify which univariate entropy values correspond to low, medium, and high univariate entropy, what would those values be? Thanks in advance, David 


Don't quote me, but I think of:

High: 0.8 - 1.0
Medium: 0.6 - 0.8
Low: below 0.6 
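These bands could be applied in code as below. Note the cutoffs are only the informal rule of thumb offered in this reply ("don't quote me"), not published criteria, and the function name is made up for illustration.

```python
def entropy_band(e):
    """Classify a (univariate) entropy value into the informal bands
    suggested in this thread: rules of thumb, not published cutoffs."""
    if not 0.0 <= e <= 1.0:
        raise ValueError("relative entropy lies in [0, 1]")
    if e >= 0.8:
        return "high"
    if e >= 0.6:
        return "medium"
    return "low"
```

For instance, the overall entropy of 0.53 reported later in this thread would fall in the "low" band under these cutoffs.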


Hi, I'm wondering whether univariate entropy values can be used to evaluate the quality of indicators for longitudinal mixture models? I'm trying to troubleshoot a 4-class LCGA model for a 5-category ordinal outcome using 7 waves of data that has low entropy (0.53). I'd thought that the low entropy was largely because of substantial sample attrition. However, I requested the univariate entropy values and they're all uniformly low (range: 0.270-0.295!). Are univariate entropy values valid for these types of models? Does this suggest that we should reconsider our latent class indicators? 


Sharon, currently the univariate entropy doesn't take the missing data into account; it essentially implies what the entropy would be if all the data were observed. We will change this in the next Mplus version to account properly for the missing data. Thanks for pointing this out. Tihomir 


How can I get the univariate entropy for each item, so that I can select the items with higher univariate entropy values to represent the latent variable? 


Variable-specific entropy has been given in the output since version 7.3. 


How do I get the univariate entropy in the output? What is the command? 


Our online UG says that you specify Entropy in the OUTPUT command. 
