I was going to say ``Why not just use glm()?'', but when I tried the
example given in the original message I got a different but similarly
nervous-making warning:
Warning in eval(expr, envir, enclos) : non-integer #successes
in a binomial glm!
Looking into the code I found that the warning originates in
binomial()$initialize in the lines:
m <- weights * y
if (any(abs(m - round(m)) > 0.001))
    warning("non-integer #successes in a binomial glm!")
I also noticed that if y is given as a two-column matrix (successes
and failures), then the check for non-integer values is applied to y
itself, without multiplying anything by the weights, so y passes the
check and no warning is issued. I.e.
f1 <- glm(y ~ x, weights = w, family = binomial)
causes a warning, but
f2 <- glm(cbind(y, 1 - y) ~ x, weights = w, family = binomial)
does not. The fits f1 and f2 appear to be essentially the same,
although they differ in the number of iterations, and by terms of
order 1e-8 in the coefficients and in the scaled and unscaled
covariance matrices.
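For instance, with the data from the quoted example (the set.seed()
call is my own addition, just to make the comparison reproducible):

set.seed(1)
x <- runif(10, 0, 3)
y <- c(rep(0, 5), rep(1, 5))
w <- rep(1/10, 10)
f1 <- glm(y ~ x, weights = w, family = binomial)                # warns
f2 <- glm(cbind(y, 1 - y) ~ x, weights = w, family = binomial)  # no warning
max(abs(coef(f1) - coef(f2)))   # negligible
max(abs(vcov(f1) - vcov(f2)))   # likewise for the covariance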
So is the warning that arises in the ``f1'' case actually
appropriate?
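Incidentally, one way to check whether glm() with prior weights is
maximizing the weighted log-likelihood written in the quoted message
below is to compare it against a direct maximization with optim().
A rough sketch, using the same x, y and w as above:

negll <- function(beta, x, y, w) {    # minus the weighted log-likelihood
    eta <- beta[1] + beta[2] * x
    -sum(w * (y * eta - log(1 + exp(eta))))
}
opt <- optim(c(0, 0), negll, x = x, y = y, w = w)
rbind(optim = opt$par,
      glm   = coef(glm(y ~ x, weights = w, family = binomial)))
# the two rows should agree closely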
cheers,
Rolf Turner
rolf at math.unb.ca
===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+==
Original message:
> I tried lrm in library(Design) but I always get some error
> message. Is this function really doing weighted logistic
> regression, i.e. maximizing the following likelihood:
>
> \sum w_i*(y_i*\beta*x_i-log(1+exp(\beta*x_i)))
>
> Does anybody know a better way to fit this kind of model in R?
>
> FYI: one example of getting an error message is:
> > x=runif(10,0,3)
> > y=c(rep(0,5),rep(1,5))
> > w=rep(1/10,10)
> > fit=lrm(y~x,weights=w)
> Warning message:
> currently weights are ignored in model validation and
> bootstrapping lrm fits in: lrm(y ~ x, weights = w)
>
> Although the model can be fitted, the above warning
> makes me uncomfortable. Can anybody explain it
> a little bit?
>
> Best wishes,
> Feixia