> From: Martin Maechler <maechler@stat.math.ethz.ch>
> To: r-core@stat.math.ethz.ch, r-devel@stat.math.ethz.ch
> Subject: Re: ``nlm(.) with derivatives''
>
> >>>>> "DougB" == Douglas Bates <bates@stat.wisc.edu> writes:
>
> DougB> .......
> DougB> ....... { time comparisons in testing lme(..) for R }
> DougB> .......
>
> DougB> This can be expected to run faster when a version of nlm
> DougB> that accepts gradients and Hessians is available.
>
> Doug, do I understand properly that it won't be you
> who will work on this?
>
> Do we have volunteers
> [who also are knowledgable in numerical optimization,..]?
Using gradient and Hessian information in the minimizer is a relatively
minor change.  BUT, do we want to continue with this particular bit of
code?  The code used is by Robert Schnabel and is based on the methods
in Dennis and Schnabel (1983), "Numerical Methods for Unconstrained
Optimization and Nonlinear Equations", Prentice-Hall.
I got it from CMLIB and I think it is pretty reputable code (much
better than the recent Applied Statistics one), but I don't know how
close it might be to the state-of-the-art. Anyone know?
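The core of what such a minimizer does with analytic derivatives is a Newton step: solve H p = -g and move along p.  As a minimal sketch (in Python rather than the code under discussion, with a hypothetical convex test function, and without the safeguards a Dennis & Schnabel implementation adds, such as line search and Hessian modification):

```python
def fgh(x, y):
    """Value, gradient and Hessian of a hypothetical smooth convex
    test function with its minimum at (3, -1)."""
    u, v = x - 3.0, y + 1.0
    f = u**4 + u**2 + v**2 + u * v
    g = (4*u**3 + 2*u + v, 2*v + u)
    H = ((12*u**2 + 2.0, 1.0),
         (1.0, 2.0))
    return f, g, H

def newton(x, y, tol=1e-10, maxit=50):
    """Undamped Newton iteration using the analytic gradient and Hessian."""
    for _ in range(maxit):
        f, (gx, gy), ((a, b), (c, d)) = fgh(x, y)
        if gx*gx + gy*gy < tol*tol:      # stop when the gradient is tiny
            break
        det = a*d - b*c                  # solve H p = -g by Cramer's rule
        px = -( d*gx - b*gy) / det
        py = -(-c*gx + a*gy) / det
        x, y = x + px, y + py            # full Newton step (no line search)
    return x, y

xmin, ymin = newton(0.0, 0.0)
```

Supplying g and H analytically is what saves the finite-difference evaluations that dominate the cost when only the objective is available.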
To make use of Hessian information we would also need to upgrade the
"deriv" function so that the function it returns also returns the
Hessian.  Again, this is not difficult, but it does demand some
familiarity with R internals.
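What such an upgraded "deriv" has to produce can be sketched with a toy symbolic differentiator: differentiate the expression tree once per variable for the gradient, then once more for the Hessian, and return a function that evaluates all three together.  This is a hypothetical illustration in Python, not R internals; the names (Expr, deriv3, etc.) are invented and only + and * are handled:

```python
class Expr:
    def __add__(self, o): return Add(self, o)
    def __mul__(self, o): return Mul(self, o)

class Const(Expr):
    def __init__(self, v): self.v = v
    def eval(self, env): return self.v
    def diff(self, x): return Const(0)

class Var(Expr):
    def __init__(self, name): self.name = name
    def eval(self, env): return env[self.name]
    def diff(self, x): return Const(1 if x == self.name else 0)

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) + self.b.eval(env)
    def diff(self, x): return Add(self.a.diff(x), self.b.diff(x))

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, env): return self.a.eval(env) * self.b.eval(env)
    def diff(self, x):  # product rule
        return Add(Mul(self.a.diff(x), self.b), Mul(self.a, self.b.diff(x)))

def deriv3(expr, vars):
    """Build a function returning (value, gradient, Hessian), as an
    upgraded deriv() would."""
    grad = [expr.diff(v) for v in vars]
    hess = [[g.diff(v) for v in vars] for g in grad]
    def f(env):
        return (expr.eval(env),
                [g.eval(env) for g in grad],
                [[h.eval(env) for h in row] for row in hess])
    return f

x, y = Var("x"), Var("y")
f = deriv3(x * x * y + y * y, ["x", "y"])
val, g, H = f({"x": 1.0, "y": 2.0})
```

The differentiation happens once, up front; the returned function just evaluates the precomputed expressions, which is the same division of labour deriv() already uses for gradients.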
I would be quite keen to see this fixed, but I'm already a bit thinly
spread at present, so it probably won't be me that does it.
[ My major projects remain: contrasts, graphics and internal data
structures + the Mac :-)]
Ross
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
to: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._