What is a good value of entropy?
 Jon Heron posted on Thursday, September 20, 2007 - 5:37 am
Ramaswamy's paper on entropy does not appear to indicate what a good value of entropy would be, merely that 0.62 would indicate 'fuzziness'.

I would like to add a sentence to my methods section with words to the effect of

"Entropy values over 0.8 indicate good separation of the latent classes (ref)"

Please can you help?


cheers



Jon
 Matthew Cole posted on Thursday, September 20, 2007 - 6:02 am
Here's what I use:

Entropy values approaching 1 indicate clear delineation of classes (Celeux & Soromenho, 1996).

Celeux, G., & Soromenho, G. (1996). An entropy criterion for assessing the number of clusters in a mixture model. Journal of Classification, 13, 195-212.
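
As a point of reference (the definition below follows Ramaswamy et al., 1993, on which the statistic Mplus reports is commonly said to be based): for n subjects, K classes, and estimated posterior class probabilities \hat{p}_{ik}, the relative entropy is typically written as

E_K = 1 - \frac{\sum_{i=1}^{n} \sum_{k=1}^{K} \left( -\hat{p}_{ik} \ln \hat{p}_{ik} \right)}{n \ln K}

so that E_K = 1 corresponds to perfectly crisp classification and values near 0 to essentially random assignment.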
 Jon Heron posted on Thursday, September 20, 2007 - 6:05 am
Cheers Matthew, I was unable to get hold of that paper.

I have an entropy of 0.935; I guess that's approaching one, whichever way you look at it.



J
 Jon Heron posted on Thursday, September 20, 2007 - 6:09 am
Found it after all - hurrah!

Note to self: don't always rely on PubMed.
 Andy Ross posted on Friday, November 21, 2008 - 4:56 am
Dear Prof. Muthen

Following on from Jon's query

In your opinion, what is the general thinking on quality of classification as a criterion for accepting a latent class solution as useful? For example, would you disregard a solution with an entropy lower than .8 as a fairly poor representation of a population, because it cannot distinguish between the classes very well?

Also, are there any strategies for improving entropy - i.e., is poor classification often linked to a specific attribute of a model? I ask because I've recently found that models which start with an entropy of approximately .5 to .6 when estimating two classes often remain fairly poor at classifying even as the number of estimated classes is increased.


Many thanks

Andy
 Bengt O. Muthen posted on Friday, November 21, 2008 - 7:53 am
The quality of classification as measured by entropy has a different impact in different settings. For example, you could have poor entropy and still be able to distinguish some of the classes very clearly. Or, you could use your LCA to predict a distal outcome from the latent classes and get a significant relationship that is estimated with a small SE even with low entropy. The use of "most likely class membership" as a variable for further analysis, however, is problematic when the entropy goes much lower than 0.8.

The best strategy for improving entropy is to add good indicators - indicators that discriminate well between the classes. Given a certain set of indicators, however, you would first find the model that fits the data best and then accept the entropy it gives.
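
A minimal sketch of the entropy computation in Python, assuming you have exported the posterior class probabilities to an n x K array (e.g., via SAVEDATA: SAVE = CPROBABILITIES in Mplus); the function name and the toy matrices are illustrative only:

import numpy as np

def relative_entropy(post):
    # Normalized entropy: E_K = 1 - sum_i sum_k (-p_ik * ln p_ik) / (n * ln K),
    # where `post` is an n x K array of posterior probabilities, rows summing to 1.
    post = np.clip(post, 1e-12, 1.0)  # guard against log(0)
    n, K = post.shape
    return 1.0 - (-(post * np.log(post)).sum()) / (n * np.log(K))

# Crisp posteriors -> entropy near 1 (clear separation)
crisp = np.array([[0.98, 0.02], [0.01, 0.99], [0.97, 0.03]])
print(relative_entropy(crisp))  # roughly 0.86

# Fuzzy posteriors -> entropy near 0 (poor separation)
fuzzy = np.array([[0.55, 0.45], [0.48, 0.52], [0.60, 0.40]])
print(relative_entropy(fuzzy))  # roughly 0.01

This also makes the point above concrete: fuzzier posterior probabilities drag the statistic toward zero, which is why sharper indicators, not simply more classes, are what raise it.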
 Harald Gerber posted on Tuesday, March 17, 2009 - 6:37 am
I've heard that the entropy value partially depends on the number of classes (unfortunately, I can't remember where) and tends to be smaller for fewer classes.
If so, could you - very briefly - explain why? Thank you very much.
 Bengt O. Muthen posted on Friday, March 20, 2009 - 1:17 pm
I have not heard that, but I can imagine that entropy might be lower for a smaller number of classes, because more classes may be needed to clearly separate the clusters of people in the data.