Message/Author 

Caoshang posted on Saturday, May 07, 2011  11:23 pm



Dear professor: I want to fit the 3-parameter IRT model (a = item discrimination, b = item difficulty, c = item "guessing") and the 4-parameter model, which adds d = item "carelessness". Can you give me some examples or advice on how to write the code to set this up in Mplus? I can't find any useful information, especially typical examples, on the Internet. Thank you for your patience.


Mplus handles only a and b. For this, see Example 5.5. 
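For reference, a 2PL setup in the style of User's Guide Example 5.5 looks roughly like the following. This is a sketch: the item names u1-u3 and the data file name are placeholders, not from this thread.

```
TITLE:     2PL IRT model (in the style of Example 5.5)
DATA:      FILE = items.dat;      ! placeholder file name
VARIABLE:  NAMES = u1-u3;
           CATEGORICAL = u1-u3;
MODEL:     f BY u1-u3*;           ! free loadings = item discriminations (a)
           f@1;                   ! factor variance fixed at 1 for identification
ANALYSIS:  ESTIMATOR = ML;        ! ML with a logit link gives the 2PL metric
OUTPUT:    TECH1;
```

The difficulties (b) come from the item thresholds divided by the loadings, as described in the Mplus IRT documentation.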


Since this post in 2011, has Mplus added features so that it can estimate a 3-parameter logistic model? Thank you for your time. Matt


It's in development. 

Ali posted on Wednesday, September 21, 2016  9:16 am



Hello, I am trying to fit my data with the 3PL IRT model and the mixture 3PL model. I read the Mplus IRT paper, but I was quite confused, so I have a few questions.

(1) For the parameterization, is the coefficient beta the same as lambda, and is tau_1 the same as tau in equation (20) (p. 5) of the Mplus IRT paper? If so, then a (item discrimination) = lambda, b (item difficulty) = tau_1/lambda, and c (guessing parameter) = tau_2.

(2) For the code, I am not sure my input for the mixture 3PL is right. I have 32 dichotomous items. I fixed the factor mean at 0 and the variance at 1 in both latent classes, but the loadings and thresholds (i.e., tau_1 and tau_2) are freely estimated. The input is as follows:

  USEVARIABLES ARE U1-U32;
  CATEGORICAL = U1-U32 (3pl);
  CLASSES = c(2);
  ANALYSIS:
    TYPE = MIXTURE;
    ALGORITHM = INTEGRATION;
    STARTS = 100 20;
    PROCESSORS = 2;
  MODEL:
    %OVERALL%
    f BY U1-U32*;
    f@0; f@1;
    %c#1%
    [f@0]; f@1;
    [u1$1-u32$1];
    [u1$2-u32$2];
    %c#2%
    [f@0]; f@1;
    [u1$1-u32$1];
    [u1$2-u32$2];


(1) Yes, beta in (39) is the same as lambda in (20). The IRT translation to discrimination a is shown in (21) and to difficulty b in (22). The guessing (c) translation is shown in (40). (2) The input looks OK except that you forgot to put brackets around the first instance of f@0. Also, you will likely need priors for the guessing parameters, as shown in our IRT document.
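To spell out the bracket fix: in Mplus syntax, brackets denote means/intercepts, while a statement without brackets refers to a variance. So the %OVERALL% part of the input above should read (sketch):

```
MODEL:
  %OVERALL%
  f BY U1-U32*;
  [f@0];   ! factor mean fixed at 0; without brackets, f@0 fixes the VARIANCE at 0
  f@1;     ! factor variance fixed at 1
```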

Ali posted on Thursday, September 22, 2016  10:58 am



Thanks! As for setting a prior for a guessing parameter, should I set the prior the same way you set tau_2, as N(1,1)? Suppose I have 32 items and 2 latent classes; then I would set each tau_2's prior as N(1,1) for each item in each latent class. Does that mean I have to set 64 N(1,1) priors for tau_2?


Right. Use the list function. And try one class first. 
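The list function mentioned here lets you label a range of parameters in one statement and then assign the same prior to all of them with a single line. A minimal one-class sketch (the labels p1-p32 are arbitrary names; the variable names are assumed to match the poster's):

```
VARIABLE:
  CATEGORICAL = u1-u32 (3pl);
MODEL:
  f BY u1-u32*;
  [f@0]; f@1;
  [u1$2-u32$2] (p1-p32);   ! labels the 32 second thresholds (guessing) at once
MODEL PRIORS:
  p1-p32 ~ N(1,1);         ! one line assigns the same prior to all 32 labels
```

With two classes, the same labeling would be repeated in each class block, giving the 64 priors discussed above.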

Ali posted on Saturday, September 24, 2016  4:13 am



I tried to set a prior for the guessing parameters in the second class. But it gave me the warning message: "WARNING: THE BEST LOGLIKELIHOOD VALUE WAS NOT REPLICATED. THE SOLUTION MAY NOT BE TRUSTWORTHY DUE TO LOCAL MAXIMA. INCREASE THE NUMBER OF RANDOM STARTS." Could this suggest that the mixture 3PL doesn't fit my data, and that I should fit the mixture 2PL or mixture 1PL instead? Here is my code:

  MODEL:
    %OVERALL%
    f BY U1-U32*;
    [f@0]; f@1;
    %c#1%
    [f@0]; f@1;
    [u1$1-u32$1];
    [u1$2-u32$2];
    %c#2%
    [f@0]; f@1;
    [u1$1-u32$1];
    [u1$2-u32$2] (p1-p32);
  MODEL PRIORS:
    p1-p32 ~ N(1,1);
  OUTPUT: TECH1 TECH8;


You need to increase the number of random starts. You did not reach a global solution. Try STARTS = 200 50; or more. The second number should be approximately 1/4 of the first. 
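A practical pattern, once a larger STARTS run does replicate the best loglikelihood, is to save the seed of the best solution so the model can be re-estimated later without repeating all the random starts. A sketch (the specific STARTS values are illustrative; OPTSEED takes the seed printed next to the best loglikelihood in the output):

```
ANALYSIS:
  TYPE = MIXTURE;
  ALGORITHM = INTEGRATION;
  STARTS = 400 100;     ! second number roughly 1/4 of the first
  PROCESSORS = 2;
! Once the best loglikelihood replicates, rerun with:
!   STARTS = 0;
!   OPTSEED = <seed of the best solution from the output>;
```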

Ali posted on Thursday, September 29, 2016  3:40 am



I used STARTS = 200 50, and Mplus ran for 24 hours. But it gave the same warning message: "WARNING: THE BEST LOGLIKELIHOOD VALUE WAS NOT REPLICATED. THE SOLUTION MAY NOT BE TRUSTWORTHY DUE TO LOCAL MAXIMA. INCREASE THE NUMBER OF RANDOM STARTS."
