NOT ENOUGH MEMORY SPACE
 Luke Brooks-Shesler posted on Monday, June 11, 2012 - 1:59 pm
My model will not run and I'm receiving this error message:

*** FATAL ERROR
THERE IS NOT ENOUGH MEMORY SPACE TO RUN Mplus ON THE CURRENT
INPUT FILE. THE ANALYSIS REQUIRES 4 DIMENSIONS OF INTEGRATION RESULTING
IN A TOTAL OF 0.50625E+05 INTEGRATION POINTS. THIS MAY BE THE CAUSE
OF THE MEMORY SHORTAGE. YOU CAN TRY TO FREE UP SOME MEMORY BY CLOSING
OTHER APPLICATIONS THAT ARE CURRENTLY RUNNING. NOTE THAT THE MODEL MAY
REQUIRE MORE MEMORY THAN ALLOWED BY THE OPERATING SYSTEM.
REFER TO SYSTEM REQUIREMENTS AT www.statmodel.com FOR MORE
INFORMATION ABOUT THIS LIMIT.

I closed all other programs. My computer has a 2.79 GHz processor and 3.24 GB of RAM.

I have also run the Disk Cleanup utility and defragmented the hard drive.
 Linda K. Muthen posted on Monday, June 11, 2012 - 2:53 pm
Try adding:

ALGORITHM=INTEGRATION;
INTEGRATION=MONTECARLO (5000);

to the ANALYSIS command.
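For readers unsure where these lines go, here is a minimal sketch of an ANALYSIS command with both options added (the ESTIMATOR line is only an illustration; keep whatever estimator and other settings your input file already uses):

ANALYSIS:
  ESTIMATOR = MLR;                  ! illustration only; keep your existing estimator
  ALGORITHM = INTEGRATION;          ! request numerical integration
  INTEGRATION = MONTECARLO (5000);  ! Monte Carlo integration with 5000 points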
 Luke Brooks-Shesler posted on Monday, June 11, 2012 - 3:42 pm
I got another error message:

SAMPLE STATISTICS

THE ESTIMATED COVARIANCE MATRIX COULD NOT BE INVERTED.
COMPUTATION COULD NOT BE COMPLETED IN ITERATION 1.
CHANGE YOUR MODEL AND/OR STARTING VALUES.

THE MODEL ESTIMATION DID NOT TERMINATE NORMALLY DUE TO AN ERROR IN THE
COMPUTATION. CHANGE YOUR MODEL AND/OR STARTING VALUES.
 Linda K. Muthen posted on Monday, June 11, 2012 - 4:54 pm
Please send your output and license number to support@statmodel.com.
 Richard E. Zinbarg posted on Friday, January 25, 2013 - 2:41 pm
I am running a survival analysis with several latent variables as predictors. Whether I run the analysis on my laptop (Macbook Air with 4 GB of RAM) or my university's social science computing cluster (which has 62 GB to allocate to my analysis), I get an error message saying:
*** FATAL ERROR
THERE IS NOT ENOUGH MEMORY SPACE TO RUN Mplus ON THE CURRENT
INPUT FILE. THE ANALYSIS REQUIRES 13 DIMENSIONS OF INTEGRATION RESULTING
IN A TOTAL OF 0.19462E+16 INTEGRATION POINTS. THIS MAY BE THE CAUSE
OF THE MEMORY SHORTAGE. YOU CAN TRY TO REDUCE THE NUMBER OF DIMENSIONS
OF INTEGRATION OR THE NUMBER OF INTEGRATION POINTS OR USE INTEGRATION=MONTECARLO

I have run the model with integration=montecarlo, but I am wondering whether there is any way to know how much memory would be needed to run the model without integration=montecarlo. (Since I don't understand the integration=montecarlo option very well, I worry that it might give a different result than what one would get with enough memory to run the model without that option.)
Thanks!
 Bengt O. Muthen posted on Friday, January 25, 2013 - 4:15 pm
I don't think computers can handle the memory requirement with 13 dimensions of integration; even with only 5 points per dimension, you get 5 to the power of 13 integration points, which is a lot. It also gets worse with a large sample size. The only way is to use

integration=montecarlo(5000);

but the precision in the loglikelihood may not be great.
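For concreteness, here is the arithmetic behind the point counts in the two error messages; a worked sketch assuming the default of 15 integration points per dimension, which matches both reported totals:

\[
15^{13} = 1{,}946{,}195{,}068{,}359{,}375 \approx 0.19462 \times 10^{16}, \qquad
15^{4} = 50{,}625 = 0.50625 \times 10^{5}, \qquad
5^{13} = 1{,}220{,}703{,}125 .
\]

So even with only 5 points per dimension, 13 dimensions still require over a billion grid points, whereas INTEGRATION=MONTECARLO(5000) uses a fixed total of 5000 randomly drawn points regardless of the number of dimensions.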
 Richard E. Zinbarg posted on Sunday, January 27, 2013 - 8:08 pm
ok, thanks for the quick reply Bengt!
 Pasha Malik posted on Friday, August 10, 2018 - 3:33 am
The most frustrating thing is not having enough space on the hard drive, or any drive for that matter.
I have always faced that issue.
 Pasha Malik posted on Thursday, August 16, 2018 - 12:36 am
I have found the solution to my external hard drive problem.

Thanks to all of you.