Hee-Jin Jun posted on Tuesday, October 23, 2007 - 11:05 am
I am working on a trajectory analysis. The outcome variable is binary and we have 18 time points (ages 0 to 17). At the last three time points (ages 15, 16, 17) we have some systematic missingness (some kids haven't reached that age yet). First I fit the model with fixed time scores (0 to 17), but it didn't seem to fit well, so I tried adding a quadratic term. Unfortunately, the run took quite a long time (a few days) and then crashed: when I checked, the program had ended without producing an output file. Next, I freed the time scores except for the first two. One day later, I got the same result (the program ended with no output file).
I am wondering whether this is a known bug or a new one, and how I can get results. Because there is no output file, I don't even get an error message.
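For reference, a growth model of the kind described might be set up as in the following Mplus input. This is only a sketch, not the poster's actual input: the variable names (y0-y17) and the missing-value flag are hypothetical, and the time scores after the first two are freed as the post describes.

```
TITLE:     Binary-outcome growth model, ages 0-17 (sketch)
VARIABLE:  NAMES = y0-y17;
           CATEGORICAL = y0-y17;      ! binary outcomes
           MISSING = ALL (999);       ! hypothetical missing-value flag
ANALYSIS:  ESTIMATOR = ML;
           ALGORITHM = INTEGRATION;   ! needed for categorical outcomes
MODEL:     ! linear growth; time scores freed except the first two,
           ! with the chronological ages as starting values
           i s | y0@0 y1@1 y2*2 y3*3 y4*4 y5*5 y6*6 y7*7 y8*8 y9*9
                 y10*10 y11*11 y12*12 y13*13 y14*14 y15*15 y16*16 y17*17;
           ! quadratic alternative: i s q | with all time scores fixed at 0-17
```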
Hee-Jin Jun posted on Tuesday, October 30, 2007 - 9:26 am
Running Mplus with a large dataset and many time points has been challenging. While running a program with 12 time points (N = about 6,300), I found that the running time increased dramatically: e.g., one iteration took about 7,500 seconds, which is about two hours, and it has been running since yesterday. I am wondering whether this is a sign that the program is having difficulties, so that it would be better to stop it and try again. Thank you for your help.
E.g., two consecutive EM iterations from the on-screen history:
86 -0.60855213D+05 0.0010999 0.0000000 EM 7472.53 79127.1
87 -0.60855211D+05 0.0014535 0.0000000 EM 7457.41 184.5
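With a binary outcome and random growth factors, each EM iteration requires numerical integration over the factors, which is usually what makes iterations this slow. The sketch below shows ANALYSIS and OUTPUT settings commonly used to monitor and reduce that burden; the option names are from the Mplus ANALYSIS command, but the specific number of integration points is illustrative, not a recommendation.

```
ANALYSIS:  TYPE = RANDOM;
           ESTIMATOR = MLR;
           ALGORITHM = INTEGRATION;
           INTEGRATION = MONTECARLO (500);  ! cheaper than full quadrature
OUTPUT:    TECH8;   ! prints the per-iteration optimization history
```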
Actually, I may have gotten it to run, but I have a follow-up question. My data are normally distributed, which is why I originally went with ML, but I got the model to run using MLR and TYPE=RANDOM instead of ML. Is it OK to use MLR in this context anyway, just to get it to run?
I would want to understand why it did not run with ML. You may have different convergence criteria using TYPE=RANDOM with ML versus MLR; you can check the outputs. It is never a good idea to leave an analysis without a clear understanding of the problem.
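The convergence settings in question can be inspected in the two outputs and, if needed, set explicitly in the ANALYSIS command. A sketch, assuming the options below (MITERATIONS, MCONVERGENCE) apply to the EM algorithm used here; the values are illustrative, not recommendations:

```
ANALYSIS:  TYPE = RANDOM;
           ESTIMATOR = ML;           ! rerun with ESTIMATOR = MLR to compare
           ALGORITHM = INTEGRATION;
           MITERATIONS = 1000;       ! cap on EM iterations
           MCONVERGENCE = 0.00001;   ! EM-algorithm convergence criterion
```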
Yep, that was my concern; I didn't know why it wouldn't run with ML. OK, I will prepare the input and data to send to you, and I will try to write a bit about the steps I have taken and plan to take, in case I have missed something important.