Dear list,

I am using the optim() function to fit ~55 parameters by maximum likelihood, but it is very slow to converge (~25 min), whereas I can do the same in ~1 sec. using ADMB, and ~10 sec. using MS Excel Solver.

Are there any tricks to speed it up? Are there better optimization functions?

Thanks,
Toshihide "Hamachan" Hamazaki, PhD
Alaska Department of Fish and Game
Division of Commercial Fisheries
333 Raspberry Rd.
Anchorage, AK 99518
Phone: (907) 267-2158
Cell: (907) 440-9934
Hamazaki, Hamachan (DFG) <toshihide.hamazaki <at> alaska.gov> writes:

> Dear list,
>
> I am using optim() function to MLE ~55 parameters, but it is very slow to
> converge (~ 25 min), whereas I can do the same in ~1 sec. using ADMB, and
> ~10 sec using MS EXCEL Solver.
>
> Are there any tricks to speed up?
>
> Are there better optimization functions?

There's absolutely no way to tell without knowing more about your code. You might try method="CG":

    Method "CG" is a conjugate gradients method based on that by
    Fletcher and Reeves (1964) (but with the option of Polak-Ribiere or
    Beale-Sorenson updates). Conjugate gradient methods will generally
    be more fragile than the BFGS method, but as they do not store a
    matrix they may be successful in much larger optimization problems.

If ADMB works better, why not use it? You can use the R2admb package (on R-Forge) to wrap your ADMB calls in R code, if you prefer that workflow.

Ben
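For concreteness, here is a minimal sketch of what supplying an analytic gradient to optim() looks like with BFGS or CG. This is not your model: nll(), gr_nll(), and the simulated matrix y are made-up placeholders (55 independent normal means with unit variance), just to show the calling pattern.

    set.seed(1)
    y <- matrix(rnorm(55 * 100, mean = 2), nrow = 100)  # fake data: 100 obs x 55 parameters

    nll <- function(theta) {
      ## negative log-likelihood: independent normal means, sd = 1
      -sum(dnorm(y, mean = rep(theta, each = nrow(y)), sd = 1, log = TRUE))
    }

    gr_nll <- function(theta) {
      ## analytic gradient: d(-logL)/d(theta_j) = -sum_i (y_ij - theta_j)
      -colSums(sweep(y, 2, theta))
    }

    ## Supplying gr often matters more than the choice of method,
    ## because it avoids ~55 extra function evaluations per finite-difference gradient.
    fit_bfgs <- optim(rep(0, 55), nll, gr = gr_nll, method = "BFGS",
                      control = list(maxit = 500))
    fit_cg   <- optim(rep(0, 55), nll, gr = gr_nll, method = "CG",
                      control = list(maxit = 500))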
As I'm at least partly responsible for CG in optim, and packager of Rcgmin, I'll recommend the latter based on experience since it was introduced. I've so far seen no example where CG does better than Rcgmin, though I'm sure there are cases to be found. However, Ben is right that if ADMB does so well (it uses effectively analytic derivatives), then use it. Rcgmin really wants you to provide gradient code, and that is work.

JN

On 07/14/2011 06:00 AM, Ben Bolker wrote:
> [Ben's reply of Wed, 13 Jul 2011, quoted in full above; snipped here]
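A minimal sketch of what a Rcgmin call with user-supplied gradient code might look like, reusing the toy nll()/gr_nll() pair sketched earlier in this thread. The names and data are illustrative only, and this assumes the Rcgmin package is installed and accepts par, fn, and gr arguments as below.

    library(Rcgmin)   # install.packages("Rcgmin") first if needed

    fit_rcg <- Rcgmin(par = rep(0, 55), fn = nll, gr = gr_nll)
    fit_rcg$convergence   # 0 indicates apparent success
    head(fit_rcg$par)     # first few parameter estimates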