Message/Author 

Anonymous posted on Tuesday, October 12, 2004  2:59 pm



I'm working on a latent profile analysis with a sample of 180 sexual aggressors. Unfortunately, I have about 10 predictors, so I am having problems getting the models to terminate normally. I don't expect that all of the predictors will be important for the final LPA, but I am unsure how to reduce the number of variables. Any suggestions? Thanks in advance for your response. 


First, check that you are using Version 3.11 where attempts have been made to make it easier to have many x variables. An approximate exploratory approach would be to do the LPA without covariates, classify individuals into their most likely class, and then relate the covariates to the most likely class. 
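The exploratory approach described above can be sketched outside of Mplus. This is a rough illustration in Python using scikit-learn's `GaussianMixture` as a stand-in for an LPA (diagonal covariances approximate the usual conditional-independence assumption); the data, variable names, and two-class structure here are simulated and purely hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated continuous indicators for two hypothetical latent profiles (n = 180)
y = np.vstack([rng.normal(0, 1, (90, 4)), rng.normal(2, 1, (90, 4))])
x = rng.normal(0, 1, 180)  # a covariate, deliberately left out of the mixture

# Step 1: fit the LPA without covariates
lpa = GaussianMixture(n_components=2, covariance_type="diag",
                      n_init=20, random_state=0).fit(y)

# Step 2: classify individuals into their most likely class
most_likely = lpa.predict(y)

# Step 3: relate the covariate to the most likely class, e.g. by comparing
# covariate means across classes (a logistic regression of class on the
# covariates would be the usual next step)
for c in range(2):
    print(f"class {c}: n={np.sum(most_likely == c)}, "
          f"mean x={x[most_likely == c].mean():.2f}")
```

Note that this classify-then-relate strategy ignores classification error, which is why the reply calls it approximate.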

Anonymous posted on Friday, November 05, 2004  10:05 am



Here is my problem. I have a sample of 70 service programs in high-risk communities, with 4 to 6 continuous indicators or potentially 4 to 6 dichotomous indicators. I simply want to explore the structure of the data to see if there are potential latent classes. In the article by Cleland, Rothschild & Haslam (2000, Psychological Reports, 87), they conclude that mixture models are less likely to produce false positives (finding a class that does not exist) at the expense of missing true latent classes. In this paper they only look at samples of 100, 300, or 600. If I am willing to accept that potential, given my study is exploratory, do you see any other drawbacks in running an LPA or LCA with a sample of 70 and 4 to 6 indicators? Is there anything else that may affect the results, given the sample size (other than the failure to detect latent classes)? Moreover, if I find latent classes (even, say, only 2), how much am I taxing the model by adding one or two background variables? Many thanks in advance!! 

bmuthen posted on Friday, November 05, 2004  11:43 am



With 70 observations it may be hard to identify many classes due to too small class sizes. If only 2 substantively meaningful classes are expected, this may be ok even with 4-6 indicators. Finding an appropriate number of classes is not straightforward even with larger samples and depends on which statistic is used. Our simulations seem to favor BIC and particularly the sample-size adjusted BIC, but we haven't gone below n=200. The small sample size also causes the parameters and SEs to be less well estimated. Adding a background variable predicting class actually helps the analysis quite a bit in terms of adding information to give more stability to the estimation. Note also that Mplus can be used to easily do your own Monte Carlo simulation study to study these matters. 
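To make the BIC vs. sample-size adjusted BIC comparison concrete, here is a hedged Python sketch (again using scikit-learn's `GaussianMixture` rather than Mplus, on simulated data with n = 70). The adjusted BIC replaces ln(n) in the penalty with ln((n + 2) / 24), so it penalizes extra parameters less heavily in small samples.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Simulated data: two well-separated profiles, 5 indicators, n = 70
y = np.vstack([rng.normal(0, 1, (35, 5)), rng.normal(2.5, 1, (35, 5))])
n, d = y.shape

bic, abic = {}, {}
for k in (1, 2, 3):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=20, random_state=1).fit(y)
    # free parameters: (k-1) class weights + k*d means + k*d variances
    p = (k - 1) + k * d + k * d
    ll = gm.score(y) * n                       # total log-likelihood
    bic[k] = -2 * ll + p * np.log(n)
    abic[k] = -2 * ll + p * np.log((n + 2) / 24.0)
    print(f"k={k}: BIC={bic[k]:.1f}, aBIC={abic[k]:.1f}")
```

With clearly separated simulated classes, both criteria should favor the 2-class solution over 1 class; with real small-sample data the two criteria can disagree, which is the point of the reply above.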


To clarify, is there a rule or restriction on the number of indicators one can use when deriving a given set (1…k) of latent classes with a given sample size? I am currently working with 15 indicators using a sample of N = 257. I am exploring 2-4 latent class solutions but was wondering if my number of indicators was too large for my sample size? 


You want to have more subjects than parameters, which restricts the number of indicators and latent classes you can have. 
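The parameters-vs-subjects check above is easy to do by hand. For an LPA with k classes and p continuous indicators, a common specification estimates (k - 1) class probabilities, k * p class-specific means, and p variances held equal across classes (freeing the variances per class adds k * p instead of p). The small Python helper below just encodes that count; the exact tally depends on the model actually specified.

```python
def lpa_free_parameters(k, p, class_specific_variances=False):
    """Count free parameters in a simple LPA with k classes, p indicators."""
    weights = k - 1                                   # class probabilities
    means = k * p                                     # class-specific means
    variances = k * p if class_specific_variances else p
    return weights + means + variances

# E.g., 15 indicators, 4 classes, class-invariant variances:
print(lpa_free_parameters(4, 15))   # 3 + 60 + 15 = 78 parameters, vs. 257 subjects
```

So for the 4-class, 15-indicator case asked about, 78 parameters against N = 257 satisfies the rule of thumb, though freeing variances per class (123 parameters) starts to narrow the margin.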

Maja Flaig posted on Friday, February 20, 2015  6:06 am



Dear Linda and Bengt, I am working on several latent profile analyses with 5-22 continuous indicators and N=350. In 3 analyses I got this message when conducting the analysis with more than 2 or 3 classes: THE STANDARD ERRORS OF THE MODEL PARAMETER ESTIMATES MAY NOT BE TRUSTWORTHY FOR SOME PARAMETERS DUE TO A NON-POSITIVE DEFINITE FIRST-ORDER DERIVATIVE PRODUCT MATRIX. THIS MAY BE DUE TO THE STARTING VALUES BUT MAY ALSO BE AN INDICATION OF MODEL NON-IDENTIFICATION. THE CONDITION NUMBER IS 0.522D-16. PROBLEM INVOLVING PARAMETER 55. Do you have any suggestion how to handle the problem? Thanks in advance for your response! 


Can't say without seeing the full output, including TECH8. Please send it to support along with your license number. 
