Dear all, I would like to ask you two questions about np-GMM (quadratic np-GMM ZIP):

1. I would like to test the model fit of my np-GMMs by means of the BLRT, as shown in Kreuter & Muthen (2007). I use TECH14, but the output tells me that the BLRT is not available for models with more than one categorical latent variable. How have you handled this? What can I do?
2. How can I specify, within the same model, a different number of np-classes for each sub-class (e.g., one sub-class with 2 and another with 3 np-classes)?

Thanks a lot in advance, Luca

1. The BLRT is only available for models with a single latent class variable. It was used in the paper for a regular GMM, not np-GMM. I would rely on BIC instead.
2. I think that type of input is shown in Kreuter, F., & Muthen, B. (2007). Longitudinal modeling of population heterogeneity: Methodological challenges to the analysis of empirically derived criminal trajectory profiles. Forthcoming in Hancock, G. R., & Samuelsen, K. M. (Eds.), Advances in latent variable mixture models. Charlotte, NC: Information Age Publishing, which is on our web site. Otherwise, I can send input.
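For readers following the thread, here is a minimal sketch of the distinction being drawn. It is an illustration only, not an input from the paper: the variable names (y1-y4) and the linear growth specification are placeholders. TECH14 requests the bootstrapped LRT and is accepted only when the model has a single categorical latent variable, as in an ordinary GMM:

```
! Illustrative sketch only -- not the input from Kreuter & Muthen (2007).
VARIABLE:
  NAMES = y1-y4;
  CLASSES = c(3);            ! one categorical latent variable: TECH14 allowed
ANALYSIS:
  TYPE = MIXTURE;
MODEL:
  %OVERALL%
  i s | y1@0 y2@1 y3@2 y4@3;
OUTPUT:
  TECH14;                    ! bootstrapped LRT comparing k-1 vs. k classes
```

With two categorical latent variables, as in an np-GMM with sub-classes, TECH14 cannot be requested; instead, the information criteria (including BIC) printed in the standard model fit output can be compared across solutions, as suggested above.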

Thanks for the quick answers. Concerning question 2: the article you cited does report the input, but it does not explain how to specify a different number of np-classes for each sub-class (i.e., 2 np-classes for the first sub-class and 3 np-classes for the second). This is done in Kreuter, F., & Muthen, B. (2007). Analyzing criminal trajectory profiles: Bridging multilevel and group-based approaches using growth mixture modeling. Conditionally accepted for publication in Journal of Quantitative Criminology, but the input is not shown. Could you please send me the input? Thanks again, Luca

Please email me at bmuthen@ucla.edu and I will send you two outputs from our analyses: one with the same number of np-classes in each sub-class (3+3) and one with different numbers (3+2).
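One way such a setup is commonly sketched in Mplus is shown below. This is a hypothetical illustration, not the authors' actual input: it assumes cg is the sub-class variable and c the np-class variable, and that one np-class is emptied within a given sub-class by fixing its multinomial logit intercept at a large negative value (e.g., -15), so that its probability is effectively zero there:

```
! Hypothetical sketch only -- not the input offered in this thread.
VARIABLE:
  NAMES = y1-y4;
  CLASSES = cg(2) c(3);      ! cg = sub-classes, c = np-classes
ANALYSIS:
  TYPE = MIXTURE;
MODEL:
  %OVERALL%
  i s | y1@0 y2@1 y3@2 y4@3;
  c ON cg;                   ! np-class membership varies across sub-classes
MODEL c:
  %cg#2%
  [c#1@-15];                 ! np-class 1 has ~zero probability in sub-class 2,
                             ! leaving 3 np-classes in cg#1 and 2 in cg#2
```

The @-15 trick simply pushes one class's logit so far down that the class combination is never occupied; the same device generalizes to other (sub-class, np-class) combinations.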

Dear Bengt, I am interested in those inputs too. Is there any possibility you could send them to me as well? Thanks a lot, Daniel