Hi, I'm programming in R and below is a summary of a generalized linear model:

**************************************************
Call:
glm(formula = offspring ~ degdays, family = quasi(link = "log",
    variance = "mu"), data = fecundity)

Deviance Residuals:
     Min       1Q   Median       3Q      Max
-0.76674 -0.29117 -0.09664  0.15668  1.00800

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -1.131351   0.030480  -37.12   <2e-16 ***
degdays     -0.008803   0.000299  -29.44   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for quasi family taken to be 0.09437092)

    Null deviance: 266.29  on 1674  degrees of freedom
Residual deviance: 164.54  on 1673  degrees of freedom
AIC: NA

Number of Fisher Scoring iterations: 6
**************************************************

For my modelling purposes, I understand that my mean is
exp(-1.131351 - 0.008803*degdays). What I want to do is draw a random
number from a lognormal distribution with this mean. What is my variance?
Is it the dispersion parameter * mean? Or exp(dispersion parameter * mean)?
Or am I completely off base?

Thanks.
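
To make my question concrete, here is a rough R sketch of the interpretation I
currently have in mind. The degdays value is just an example, and the
Var(Y) = dispersion * mean assumption is exactly the part I'm unsure about;
the lognormal is matched to that mean and variance by its moments.

## One possible interpretation, sketched -- not a definitive answer.
## Under quasi(link = "log", variance = "mu") the variance function is
## V(mu) = mu, so the model implies Var(Y) = phi * mu, where phi is the
## dispersion parameter reported in the summary.

phi <- 0.09437092                 # dispersion parameter from the summary
degdays_new <- 500                # hypothetical covariate value

mu <- exp(-1.131351 - 0.008803 * degdays_new)  # fitted mean (response scale)
v  <- phi * mu                                 # candidate variance: dispersion * mean

## Moment-match a lognormal to this mean and variance:
## if X ~ LN(meanlog, sdlog), then E[X] = exp(meanlog + sdlog^2/2) and
## Var[X] = (exp(sdlog^2) - 1) * exp(2*meanlog + sdlog^2), which gives
sdlog2  <- log(1 + v / mu^2)
meanlog <- log(mu) - sdlog2 / 2

rlnorm(1, meanlog = meanlog, sdlog = sqrt(sdlog2))   # one random draw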