Hi all.  Are there any R functions around that do quick logistic regression with a Gaussian prior distribution on the coefficients?  I just want the posterior mode, not MCMC.  (I'm using it as a step within an iterative imputation algorithm.)  This isn't hard to do: each step of a glm iteration simply linearizes the derivative of the log-likelihood, and at that point essentially no effort is required to augment the data to include the prior information.  I think this can be done by going inside the glm.fit() function--but if somebody's already done it, that would be a relief!

Thanks.
Andrew

--
Andrew Gelman
Professor, Department of Statistics
Professor, Department of Political Science
gelman at stat.columbia.edu
www.stat.columbia.edu/~gelman

Statistics department office:
  Social Work Bldg (Amsterdam Ave at 122 St), Room 1016
  212-851-2142
Political Science department office:
  International Affairs Bldg (Amsterdam Ave at 118 St), Room 731
  212-854-7075

Mailing address:
  1255 Amsterdam Ave, Room 1016
  Columbia University
  New York, NY 10027-5904
  212-851-2142
  (fax) 212-851-2164
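For concreteness, a minimal sketch of the data-augmentation idea described above--not an existing package function.  It assumes independent N(prior.mean, prior.sd^2) priors on the coefficients and a 0/1 response; the name bayes.logit and its arguments are made up for illustration:

bayes.logit <- function(X, y, prior.mean = 0, prior.sd = Inf,
                        maxit = 25, tol = 1e-8) {
  ## Posterior mode for logistic regression with independent
  ## N(prior.mean, prior.sd^2) priors, via IRLS on prior-augmented data.
  p <- ncol(X)
  prior.mean <- rep(prior.mean, length.out = p)
  prior.sd   <- rep(prior.sd,   length.out = p)
  X.star <- diag(p)            # one pseudo-observation per coefficient
  w.star <- 1 / prior.sd^2     # weight 0 when prior.sd = Inf (flat prior)
  beta <- rep(0, p)
  for (it in seq_len(maxit)) {
    eta <- drop(X %*% beta)
    mu  <- plogis(eta)
    w   <- pmax(mu * (1 - mu), 1e-10)   # IRLS weights, guarded away from 0
    z   <- eta + (y - mu) / w           # working response
    ## weighted least squares on data augmented with the prior rows
    fit <- lm.wfit(rbind(X, X.star), c(z, prior.mean), c(w, w.star))
    if (max(abs(fit$coefficients - beta)) < tol) {
      beta <- fit$coefficients
      break
    }
    beta <- fit$coefficients
  }
  beta
}

With prior.sd = Inf the pseudo-observations get zero weight, so the result should reduce to the ordinary maximum-likelihood fit.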
I don't know of anything.  A brief search using RSiteSearch("Bayesian logistic regression") and RSiteSearch("Bayesian regression") led me to the BMA package plus several MCMC solutions (coda, MCMCpack, and BayesCslogistic {cslogistic}).  If it were my problem, I might spend a few minutes with BMA and then probably write my own.

I would like to see a (possibly singular) multivariate normal (or normal + inverse-gamma) "prior" as an optional argument for lm and glm which, when present, would produce the obvious "posterior" [exact for lm, approximate for glm] as an attribute of the output.

A few years ago, I wrote something along these lines that would do ordinary least squares one step at a time and get the standard OLS answer (starting from a noninformative normal + inverse-gamma prior).  From there, it is a short step to Kalman filtering: just add an appropriate "decay" function that increases the uncertainty, converting the posterior at one step into the prior for the next.

I'm sure this didn't help much beyond confirming that your own search did not overlook something obvious.

Best Wishes,
Spencer Graves
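A minimal sketch of the one-observation-at-a-time updating idea (not the code described above; the function seq.lm, its arguments, and the known-variance assumption are illustrative only).  It assumes a known error variance sigma2 and a N(m0, C0) prior on the coefficients, leaving out the inverse-gamma part; the decay argument stands in for the uncertainty-inflation step:

seq.lm <- function(X, y, m0, C0, sigma2 = 1, decay = 0) {
  ## Process one observation at a time, updating a N(m, C) posterior
  ## for the regression coefficients (sigma2 taken as known here).
  m <- m0
  C <- C0
  for (i in seq_len(nrow(X))) {
    C <- C + decay * diag(nrow(C))     # "decay": inflate uncertainty, Kalman-style
    x <- X[i, ]
    S <- drop(x %*% C %*% x) + sigma2  # predictive variance of y[i]
    K <- (C %*% x) / S                 # gain
    m <- m + K * drop(y[i] - x %*% m)  # posterior mean update
    C <- C - K %*% t(x) %*% C          # posterior covariance update
  }
  list(mean = drop(m), cov = C)
}

With m0 = rep(0, ncol(X)) and C0 a very large multiple of diag(ncol(X)), the final mean should essentially match coef(lm(y ~ X - 1)); a positive decay gives the Kalman-filter behavior mentioned above.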