Hello:

Below is a toy logistic regression problem. When I wrote my own code,
Newton-Raphson converged in three iterations using both the gradient
and the Hessian and the starting values given below. But I can't get
nlm() to work! I would much appreciate any help.

> x
[1] 10.2  7.7  5.1  3.8  2.6
> y
[1] 9 8 3 2 1
> n
[1] 10  9  6  8 10

derfs4=function(b,x,y,n)
{
b0 = b[1]
b1 = b[2]
c=b0+b1*x
d=exp(c)
p=d/(1+d)
e=d/(1+d)^2
f = -sum(log(choose(n,y))-n*log(1+d)+y*c)
attr(f,"gradient")=c(-sum(y-n*p),-sum(x*(y-n*p)))
attr(f,"hessian")=matrix(c(sum(n*e),sum(n*x*e),sum(n*x*e),sum(n*x^2*e)),2,2)
return(f)
}

> nlm(derfs4,c(-3.9,.64),hessian=T,print.level=2,x=x,y=y,n=n)
Error in choose(n, y) : argument "n" is missing, with no default

I tried a variety of other ways too. When I did get it to work, it did
not converge in 100 iterations, rather like the fgh example given on
the nlm help page.

Mervyn
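For reference, a minimal sketch (a reconstruction under the same setup,
not the poster's actual code) of the hand-rolled Newton-Raphson
iteration described above, reusing the "gradient" and "hessian"
attributes that derfs4() attaches to its value:

## Newton-Raphson on the negative log-likelihood: step
## b <- b - solve(H, g) until the step is negligible.
b <- c(-3.9, 0.64)
for (i in 1:25) {
    f    <- derfs4(b, x, y, n)
    step <- solve(attr(f, "hessian"), attr(f, "gradient"))
    b    <- b - step
    if (max(abs(step)) < 1e-8) break
}
b  # lands near the MLE, roughly (-3.54, 0.65)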
Change the name of n to something else. Also, please use spacing in
your code for readability.

> x <- c(10.2, 7.7, 5.1, 3.8, 2.6)
> y <- c(9, 8, 3, 2, 1)
> n. <- c(10, 9, 6, 8, 10)
>
> derfs4 <- function(b, x, y, n.)
+ {
+     b0 <- b[1]
+     b1 <- b[2]
+     c <- b0 + b1 * x
+     d <- exp(c)
+     p <- d / (1 + d)
+     e <- d / (1 + d)^2
+     f <- -sum(log(choose(n., y)) - n. * log(1 + d) + y * c)
+     attr(f, "gradient") <- c(-sum(y - n. * p), -sum(x * (y - n. * p)))
+     attr(f, "hessian") <- matrix(c(sum(n. * e), sum(n. * x * e),
+         sum(n. * x * e), sum(n. * x^2 * e)), 2, 2)
+     return(f)
+ }
>
> nlm(derfs4, c(-3.9, 0.64), hessian = TRUE, x = x, y = y, n. = n.)
$minimum
[1] 5.746945

$estimate
[1] -3.5380638  0.6505037

$gradient
[1] -0.031425476 -0.006637742

$hessian
          [,1]      [,2]
[1,]  5.963954  31.15266
[2,] 31.152658 192.53408

$code
[1] 4

$iterations
[1] 100

On 3/2/07, Mervyn G Marasinghe <mervyn at iastate.edu> wrote:
> [original message snipped]
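As a cross-check (an aside, not strictly needed), the same model can be
fitted with R's built-in glm(); its coefficients should agree with the
$estimate component above:

## Binomial GLM with (successes, failures) on the left-hand side;
## the coefficients come out near (-3.538, 0.651), matching nlm().
glm(cbind(y, n. - y) ~ x, family = binomial)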
Here's a partial answer to your question. The argument ``n'' is getting
lost because it ``matches'' one of the formal arguments to nlm(),
namely ``ndigit''. If you change the argument list of derfs4 to
``b, x, y, size'' (and replace references to ``n'' in the body of
derfs4 by references to ``size'') then nlm() will work (?) --- but it
seems to take 337 iterations, rather than 3 (!!!), to converge. No
idea what the problem is. (A closure-based way to dodge the name clash
entirely is sketched after the quoted message below.)

cheers,

Rolf Turner

===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===+===
Original message:

Mervyn G Marasinghe wrote:
> [original message snipped]
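A sketch of that closure-based dodge, assuming the vectors x and y from
the original post are still in the workspace (``size'' here is just the
renamed count vector):

## All data are captured in the enclosing environment, so nothing is
## passed through nlm()'s ``...'', and nlm()'s own formal arguments
## (ndigit, iterlim, ...) cannot swallow anything by partial matching.
## derfs4() is called positionally, so this works whatever its fourth
## argument is named.
size <- c(10, 9, 6, 8, 10)
obj  <- function(b) derfs4(b, x, y, size)
nlm(obj, c(-3.9, 0.64), hessian = TRUE)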