When fit with WLSMV, I get a loading estimate for IND1 (taken as an example) of .796 (.023) [the STDYX estimate is equal]. When fit with ML with Gauss-Hermite quadrature, however, I get 2.231 (.156) [STDYX, however: .776 (.022)].
I believe Mplus fits the "normal ogive GRM" model under either estimator.
- Why are the unstandardized estimates so different? What changes in the model parameterization and/or standardization between WLSMV and ML?
- In binary regression, switching from WLSMV to ML causes Mplus to switch from probit to logistic regression. Is this also the case here, i.e., does Mplus fit the "logistic GRM" when ML is specified?
2. When I estimate Samejima's GRM using the R package 'ltm', I get an item discrimination estimate of 2.142 (.149). How should I transform the Mplus parameters to this GRM parameterization? I believe the transformation lambda/(1-lambda^2)^(0.5) applies here (see Muthen & Asparouhov, 2002), but it yields 1.315 for WLSMV. This seems too far off.
I would like to add that the ltm package claims to fit the logistic GRM using marginal maximum likelihood with Gauss-Hermite quadrature. I would expect to land at least in the region of 2.142 with the transformed Mplus estimates, even though Mplus uses the normal link function. But my transformed estimate is too far off.
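For what it is worth, the arithmetic in the post can be checked numerically. This is only a sketch, assuming the WLSMV loading is on the standardized latent-response scale so that the Muthen & Asparouhov (2002) transformation applies, and using the conventional ~1.7 logit/probit scaling constant to move from the probit to the logistic metric (none of this is Mplus output):

```python
import math

# Loading from the WLSMV run quoted in the post
lam = 0.796

# Muthen & Asparouhov (2002) transformation to the probit IRT discrimination
a_probit = lam / math.sqrt(1 - lam**2)
print(round(a_probit, 3))  # 1.315, the value computed in the post

# Logistic-metric discriminations (as in ltm) run roughly 1.7 times the
# probit ones; 1.7 is the conventional logit/probit scaling constant
a_logit = 1.7 * a_probit
print(round(a_logit, 3))  # ~2.236, in the region of ltm's 2.142
```

On this reading, the WLSMV value of 1.315 is not "off" at all; it is simply on the probit metric, while ltm reports the logistic metric.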
The Mplus default is WLSMV, which uses probit. The ML default is logit, but you can use link = probit as well. These facts hold for binary as well as ordinal items. ML uses marginal maximum likelihood with numerical integration.
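The probit/logit distinction described above can be illustrated numerically: the standard normal CDF is closely approximated by a logistic CDF with slope about 1.7, which is why logit discriminations come out roughly 1.7 times larger than probit ones. This is a generic numerical check, not anything specific to Mplus:

```python
import math

def probit_cdf(x):
    """Standard normal CDF (inverse of the probit link)."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def logistic_cdf(x):
    """Standard logistic CDF (inverse of the logit link)."""
    return 1 / (1 + math.exp(-x))

# Largest gap between the normal CDF and a logistic CDF scaled by 1.7,
# over a grid from -5 to 5; the two curves track each other closely.
grid = [i / 100 for i in range(-500, 501)]
max_gap = max(abs(probit_cdf(x) - logistic_cdf(1.7 * x)) for x in grid)
print(max_gap)  # stays well under 0.02 across the whole grid
```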
Transformations between the Mplus factor analysis parameterization and that of IRT are shown in