Testing LCA Item Probability Differences
Mplus Discussion > Latent Variable Mixture Modeling >
 Chris Giebe posted on Thursday, February 22, 2018 - 3:52 am
Hi,

I'm running an LCA with 10 binary (yes/no) categorical indicators and 3 latent classes.

I have obtained a nice-looking probability plot, but all three classes have very similar, low probabilities on 2 of the items.

I'm wondering if those probabilities are significantly different from each other.
If not, I may revise my model and leave out those 2 items.

Is there a way to test this?
 Bengt O. Muthen posted on Thursday, February 22, 2018 - 4:13 pm
One way to investigate this is via variable-specific entropy described in Asparouhov and Muthen (2014b); see

http://www.statmodel.com/techappen.shtml
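[Editor's note: for readers following along, the variable-specific entropy described in that web note can be requested with the ENTROPY option of the OUTPUT command (available in Mplus 7.2 and later). A minimal sketch, with hypothetical indicator names u1-u10 standing in for the poster's 10 binary items:

VARIABLE:
  NAMES = u1-u10;
  CATEGORICAL = u1-u10;
  CLASSES = c(3);
ANALYSIS:
  TYPE = MIXTURE;
OUTPUT:
  ENTROPY;    ! prints variable-specific entropy contributions

Items with near-zero variable-specific entropy contribute little to class separation and are candidates for removal.]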
 Chris Giebe posted on Monday, February 26, 2018 - 5:47 am
Awesome, that was exactly what I was looking for.

I have another question:

In a similar analysis, I'm using the KNOWNCLASS to compare men and women in my sample.

Is there a way to test whether the class indicators are sig. different between the groups? (I.e. to test whether X1 in g1c1 is sig. different from g2c1, etc.)

Thanks in advance.
 Bengt O. Muthen posted on Monday, February 26, 2018 - 11:15 am
Yes, label their means/intercepts and use Model Test to see if they are significantly different.
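[Editor's note: a hedged sketch of this suggestion, using hypothetical names (x1, sex, cg) not taken from the thread. With KNOWNCLASS, class-specific statements are written for combinations of the known-class and latent-class variables, and MODEL TEST performs a Wald test on the labeled parameters:

VARIABLE:
  NAMES = x1-x10 sex;
  CATEGORICAL = x1-x10;
  CLASSES = cg(2) c(3);            ! cg = known groups, c = latent classes
  KNOWNCLASS = cg (sex = 0 sex = 1);
ANALYSIS:
  TYPE = MIXTURE;
MODEL:
  %OVERALL%
  %cg#1.c#1%
  [x1$1] (t11);                    ! threshold of x1 in group 1, class 1
  %cg#2.c#1%
  [x1$1] (t21);                    ! threshold of x1 in group 2, class 1
MODEL TEST:
  t11 = t21;                       ! Wald test: x1 equal across groups in class 1?
]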
 Chris Giebe posted on Wednesday, March 07, 2018 - 8:02 am
Hi again,

thanks for your suggestions. So far they have been very helpful, but I have run into another dead end:

I have labeled the class-specific item thresholds and used MODEL TEST to see if the thresholds differ between the latent classes.

This worked fine, but the order of the latent classes kept changing with each run of the Wald test.

So, I fixed the thresholds of the first class to the values I got before running the Wald tests (hoping this would keep the LCs from rotating).
Now I am getting the following error message:

*** ERROR in MODEL TEST command
A parameter label or the constant 0 must appear on the left-hand side of a MODEL TEST statement. Problem with the following:
A1 = A2

Here an excerpt of the syntax for reference:

MODEL:
%OVERALL%

%c#1%
[PLB0118$1@2.068] (a1);
...

%c#2%
[PLB0118$1] (a2);
...

%c#3%
[PLB0118$1] (a3);
...

MODEL TEST:
a1 = a2;
!a1 = a3;
!a2 = a3;
!...


Before fixing the thresholds, MODEL TEST worked fine, but I'm not seeing the problem...

Thanks in advance for your advice.
 Bengt O. Muthen posted on Wednesday, March 07, 2018 - 3:54 pm
Your latent classes should not change when you include the Wald test; check your input. If you can't find the problem, send the outputs to Support along with your license number.

Don't do threshold fixing.
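[Editor's note: the error arises because a1 is attached to a parameter fixed with @, and MODEL TEST can only reference labels of free parameters. A hedged sketch of one common alternative, consistent with the advice above: keep the thresholds free, and stabilize the class order with user-supplied starting values (*) together with STARTS = 0 rather than fixing with @:

ANALYSIS:
  TYPE = MIXTURE;
  STARTS = 0;              ! use only the starting values given below
MODEL:
  %OVERALL%
  %c#1%
  [PLB0118$1*2.068] (a1);  ! * sets a starting value; the parameter stays free
  %c#2%
  [PLB0118$1] (a2);
  %c#3%
  [PLB0118$1] (a3);
MODEL TEST:
  a1 = a2;                 ! Wald test of equal thresholds across classes
  a1 = a3;
]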