Hi R Team,

I have the following problem. I would like to run a time series regression of the form

Regression1:  A_t = alpha + beta1 * B_t + beta2 * B_{t-1} + beta3 * [(B_{t-2} + B_{t-3} + B_{t-4}) / 3] + e_t

The B's are the input values, the A's are the output values, and the subscript denotes the lag. The coefficient I am really interested in is the combined one:

beta_real = beta1 + beta2 + beta3

First: how can I run this regression without manually lagging the B's?

Second: I need the standard error of beta_real. How can I calculate it from the information returned by lm() for Regression1? (I read something about the delta method?)

Thank you a lot!

Kind regards
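To make the question concrete, here is a minimal sketch with made-up data of the manual-lag version I have in mind (the helper lagk and the column names B0, B1, Bbar are just for illustration, not from my real data). For the standard error my guess would be the linear-combination formula based on vcov(fit), though I am not sure whether this or something like msm::deltamethod is the recommended way:

    ## small helper: lag a vector by k positions, padding with NA
    lagk <- function(x, k) c(rep(NA_real_, k), head(x, -k))

    ## made-up example data; in my real problem A and B come from elsewhere
    set.seed(1)
    n <- 200
    B <- rnorm(n)
    A <- 1 + 0.5 * B + 0.4 * lagk(B, 1) +
         0.3 * (lagk(B, 2) + lagk(B, 3) + lagk(B, 4)) / 3 + rnorm(n)

    ## build the lagged regressors by hand (this is the step I would like to avoid)
    dat <- data.frame(A    = A,
                      B0   = B,
                      B1   = lagk(B, 1),
                      Bbar = (lagk(B, 2) + lagk(B, 3) + lagk(B, 4)) / 3)

    fit <- lm(A ~ B0 + B1 + Bbar, data = dat)   # lm() drops the NA rows created by the lags
    summary(fit)

    ## my guess for the standard error of beta_real = beta1 + beta2 + beta3:
    ## it is a linear combination w' * coef(fit), so se = sqrt(w' V w) with V = vcov(fit)
    w         <- c(0, 1, 1, 1)                  # weights for (Intercept), B0, B1, Bbar
    beta_real <- sum(w * coef(fit))
    se_real   <- sqrt(drop(t(w) %*% vcov(fit) %*% w))
    beta_real
    se_real

    ## I think msm::deltamethod(~ x2 + x3 + x4, coef(fit), vcov(fit)) should give the
    ## same number, but I am not sure which way is preferred.

Is this the right idea, and is there a cleaner way to specify the lags directly in the formula?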