Jon Heron posted on Wednesday, July 26, 2006 - 1:47 am
Hi, I spend much of my time these days keeping busy whilst Mplus does its thing. I wonder if I might be able to increase efficiency by upgrading bits or all of my PC. Any suggestions? Is it worth throwing more RAM at the problem or getting a better processor? Or should I just be patient and keep up to date with my filing? Thanks, Jon
|
Jon Heron posted on Wednesday, July 26, 2006 - 3:39 am
Sorry, I should have read an earlier post. Please ignore.
|
|
I don't see the "earlier post" referred to in another message in this thread, but I have the same question: what is the optimal computer configuration for processing GGMM models? Many thanks for anyone's experience and suggestions.
|
|
You can find timing trials on a few different computer configurations at: http://www.statmodel.com/download/timingcomp.pdf
|
|
Dear professors and researchers, I am running mixture analyses which take a long time to compute (searching for the right number of latent classes). I have access to multiple computers (i.e., a computer lab), but I am not sure whether there is a reasonable way to divide the task among many computers and then combine the results. Would it be possible to distribute the computation of one mixture analysis across multiple computers? I could generate scripts that would execute Mplus and retrieve the outputs from multiple computers. I don't yet quite understand the specific logic behind Mplus starts etc., but it would be nice to know if this could be done, e.g. by 1) using manually generated random starts with the OPTSEED option and then 2) simply comparing the loglikelihoods of the output solutions and picking the best/replicable one. Or am I missing something? I think this topic would be of interest to many people.
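The second step the poster describes (collecting outputs from many runs and picking the solution with the best loglikelihood) could be scripted roughly as below. This is only a sketch: it assumes each machine leaves a standard Mplus .out file containing an "H0 Value" loglikelihood line, and the directory layout and function names are illustrative, not anything Mplus itself provides.

```python
import re
from pathlib import Path

# Regex for the H0 loglikelihood line in an Mplus output file,
# e.g. "          H0 Value                       -2300.123"
LOGLIK_RE = re.compile(r"H0 Value\s+(-?\d+\.\d+)")

def extract_loglik(text):
    """Return the H0 loglikelihood from Mplus output text, or None if absent."""
    m = LOGLIK_RE.search(text)
    return float(m.group(1)) if m else None

def best_solution(out_dir):
    """Scan a directory of .out files; return (path, loglik) of the best run."""
    results = []
    for path in Path(out_dir).glob("*.out"):
        ll = extract_loglik(path.read_text())
        if ll is not None:
            results.append((path, ll))
    # Mixture loglikelihoods are negative; "best" means largest
    # (closest to zero). Replication of the best value across several
    # seeds should still be checked by eye.
    return max(results, key=lambda r: r[1]) if results else None
```

Picking the maximum alone is not enough, of course; as with Mplus's own random starts, one would want the best loglikelihood to be replicated by several independent seeds before trusting the solution.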
|
|
You can access multiple processors on one computer, but not multiple computers, using the PROCESSORS option. PROCESSORS (STARTS) will also distribute the random starts over the different processors of one computer.
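For readers unfamiliar with the option, a minimal ANALYSIS command using it might look like the following sketch; the numbers of starts and processors are placeholders to adapt to your own model and machine:

```
ANALYSIS:
  TYPE = MIXTURE;
  STARTS = 400 100;          ! initial and final-stage random starts
  PROCESSORS = 4 (STARTS);   ! spread the starts over 4 processors
```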
|