I've been fitting DSEM models for N = 1 with multiple factors and cross-lagged effects. When I request standardized coefficients, Mplus consistently reports an error computing them. So my first question is whether that error might mean I'm doing something wrong.
If this is something Mplus just doesn't do automatically right now, I'm wondering about the best way to compute standardized coefficients myself, given that the output provides residual variances for the factors rather than total variances. Should I perhaps first fit a model with no effects across time and use the variances from that model to calculate approximate standardized coefficients for the cross-lagged model? Happy to provide more detail if that would help.
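To make the second question concrete, here is a sketch of the kind of hand calculation I have in mind, assuming the cross-lagged factor model can be treated as a VAR(1) process. The coefficient matrix `Phi` and residual covariance matrix `Psi` below are made-up numbers standing in for the Mplus estimates; the idea is to get the model-implied stationary (total) variances from the lagged coefficients and residual variances, rather than from a separate no-lag model, and standardize with those.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical estimates for a two-factor cross-lagged (VAR(1)) model:
# Phi holds autoregressive (diagonal) and cross-lagged (off-diagonal)
# coefficients; Psi holds the factor residual (co)variances.
Phi = np.array([[0.4, 0.1],
                [0.2, 0.3]])
Psi = np.array([[1.0, 0.2],
                [0.2, 0.8]])

# The stationary covariance Sigma solves Sigma = Phi Sigma Phi' + Psi
# (a discrete Lyapunov equation), giving total factor variances.
Sigma = solve_discrete_lyapunov(Phi, Psi)
sd = np.sqrt(np.diag(Sigma))

# Standardize each lagged coefficient:
# phi_std[i, j] = phi[i, j] * sd(predictor j) / sd(outcome i).
Phi_std = Phi * sd[np.newaxis, :] / sd[:, np.newaxis]
print(Phi_std)
```

Is this calculation on the right track, or is there a reason the cross-lagged structure makes it a poor approximation?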