Message/Author 

Prabakar posted on Tuesday, November 08, 2011  7:59 am



How do I fit an SEM model with panel data? I don't want to do an ordinary regression for each panel/category.


When you say panel data, I interpret that as each subject being measured at more than one time point. The analysis depends on whether you want to model the development over time, or simply take into account the across-time correlations for each subject. In the former case you do longitudinal modeling such as growth modeling. 
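For the growth-modeling case mentioned in the reply, a minimal sketch of a linear growth model (the outcome names y1-y4 and the four-wave design are illustrative assumptions, not from the original post):

```
MODEL:
  ! random intercept (i) and slope (s) for an outcome
  ! observed at four waves, with time scores 0-3
  i s | y1@0 y2@1 y3@2 y4@3;
```

The time scores after @ set the spacing of the measurement occasions; unequally spaced waves would get correspondingly unequal scores.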


Dear Professor, I have a model of 5 measured variables (x1, x2, x3, x4, and x5) and 3 latent variables (f1, f2, and f3). The observed variables are measured at 7 time points for each of the 750 subjects. x1-x5, f1, and f2 are time-specific (i.e., indicators of performance at each measurement occasion); f3 captures the persistence of each subject's aggregate performance. I am not interested in the effect of time. The purpose is to validate a multidimensional performance construct that evolves over time.

!here I use 3 time points
F11 BY x11@1 x21* x31*;
F21 BY F11@1 x11* x21*;
F12 BY x12@1 x22* x32*;
F22 BY F12@1 x12* x22*;
F32 BY F21@1 F22*;
F13 BY x13@1 x23* x33*;
F23 BY F13@1 x13* x23*;
F32 BY F21@1 F22* F23*;


The question is: How should I model this? Best, JeanSamuel 


Your approach seems reasonable. 


Hello, I am trying to run a panel analysis using two waves of longitudinal data. Mplus is installed on my MacBook. However, Mplus does not run the analysis (or at least it does not show the usual "working" window I expect when running a complex analysis). Instead, it opens a completely blank output window. Is this an issue with the syntax I am using, or a software problem? Thank you so much for your help! Kathleen 


Please send the input and data to support@statmodel.com. 

Grace Song posted on Monday, December 14, 2015  2:15 am



Hi, Bengt O. Muthen, just to add on to your comments. You mentioned: "The analysis depends on whether you want to model the development over time, or simply take into account the across-time correlations for each subject. In the former case you do longitudinal modeling such as growth modeling." What about the latter case? I just want to control for the unobserved variables for each subject. I would like to conduct SEM with panel data, not looking at the time effect, but controlling for the unobserved variables for each subject. Please kindly advise. Thank you! 


You can do that using TYPE=COMPLEX with CLUSTER = id to just adjust the SEs, or you can do it using TYPE=TWOLEVEL, where you have random intercepts for the DVs with variances and covariances estimated on Between; that captures the correlation due to repeated measures within the same individual. 
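A rough sketch of the two options described in the reply (the variable names y1, y2, x, and id are hypothetical placeholders):

```
! Option 1: TYPE=COMPLEX -- SEs adjusted for clustering only
VARIABLE:  CLUSTER = id;
ANALYSIS:  TYPE = COMPLEX;

! Option 2: TYPE=TWOLEVEL -- random intercepts for the DVs,
! with their variances and covariances estimated on Between
VARIABLE:  CLUSTER = id;
ANALYSIS:  TYPE = TWOLEVEL;
MODEL:
  %WITHIN%
  y1 ON x;
  y2 ON x;
  %BETWEEN%
  y1 WITH y2;   ! covariance of the random intercepts
```

Option 1 treats the within-subject correlation as a nuisance; Option 2 models it explicitly through the Between-level variances and covariances.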



Hi, Could you please suggest how I can perform a panel data regression? (I need to take into account the across-time correlations for each subject.) I saw Chapter 8, which discusses this issue, but I am still not confident. My data look something like the following:

ID  timeInstant  Com  Speed  Spacing
1   1            1    20     10
1   2            1    22     12
2   1            0    26     8
2   2            1    28     15
3   1            0    23     13
3   2            0    22     17
...

Model: Com ON Speed Spacing;

I gave it a try but I am not sure whether I have done it correctly:

VARIABLE: NAMES ARE ID Time Com Speed Spacing;
  USEVARIABLES ARE Com Speed Spacing;
  NOMINAL IS Com;
  CLUSTER = ID;
  WITHIN = Speed Spacing;
ANALYSIS: TYPE = TWOLEVEL RANDOM;
  ESTIMATOR = ML;
MODEL: %WITHIN%
  Com ON Speed TimeH;

Thanks, Anshuman


Hello, I'd like a bit of advice on a longitudinal panel model. I completed the measurement invariance testing... and partial strict invariance was reached, with good-fitting models throughout the invariance testing. When I finally get down to business adding the autoregressive and cross-lagged paths, I encounter standardized (STDYX) coefficients for some of the autoregressive paths that are greater than 1.00. Have I missed something in specifying constraints? Following Little (2013), I used the fixed-factor method of scaling, so the Time 1 constructs are assigned a variance of 1 and an intercept of 0. With all the constraints in place, should I be looking at the unstandardized coefficients for the paths between latent variables? That is, are the unstandardized coefficients "standardized" in panel models because the T1 constructs are scaled to 1? Thanks, KL 


Standardized paths greater than 1 can happen with highly correlated variables. For the time 2 factor I assume you are fixing the residual variance at 1, not the full variance. The unstandardized paths are only standardized if all variables involved have (full) variances of 1. 


Thank you! This is all very helpful for my self-teaching of Mplus. Lastly, what is the distinction between fixing the Time 2 residual variance at 1 versus the full variance? (All I know is the @1 function.) My syntax follows that of an example in Little (2013), which uses the fixed-factor scaling method to test for measurement invariance. Namely:

Pos1@1 Pos2* Pos3*;
[Pos1@0 Pos2* Pos3*];
Neg1@1 Neg2* Neg3*;
[Neg1@0 Neg2* Neg3*];

Are you saying that once the autoregressive paths are specified, the factors at time 2 and time 3 receive the @1 as well, to fix their residual variance? 


The statements you show are fine as they are (I assume that Pos and Neg are factors): the factor variances at time 2 and time 3 are not fixed at 1 but are freely estimated, as they should be. 
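To illustrate the point in the reply (a sketch using the Pos factors from the exchange above): once the autoregressive paths are added, the * on the later factors refers to their residual variances, which remain freely estimated; only the Time 1 factor is fixed for scaling:

```
Pos2 ON Pos1;       ! autoregressive paths
Pos3 ON Pos2;
Pos1@1; [Pos1@0];   ! Time 1 factor scaled: variance 1, mean 0
Pos2*; Pos3*;       ! residual variances at later waves stay free
```

Fixing Pos2@1 here would constrain the time 2 residual variance, not the full variance, because Pos2 is now an endogenous (predicted) factor.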
