similar to: optim control trace=-1 gives more output than trace=0 (PR#2691)

Displaying 20 results from an estimated 20000 matches similar to: "optim control trace=-1 gives more output than trace=0 (PR#2691)"

2007 Feb 23
1
optim(method="L-BFGS-B") abnormal termination
Hi, my call of optim() with the L-BFGS-B method ended with the following error message: ERROR: ABNORMAL_TERMINATION_IN_LNSRCH Further tracing shows: Line search cannot locate an adequate point after 20 function and gradient evaluations final value 0.086627 stopped after 7 iterations Could someone please tell me whether it is possible to increase the limit of 20 evaluations? Is it even worth
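As far as I can tell, the 20-evaluation cap sits inside the underlying L-BFGS-B Fortran line-search code and is not exposed through control; what can be tuned from R are the documented L-BFGS-B controls. A minimal sketch, assuming a stand-in objective f and starting value p0 (both hypothetical):

f  <- function(p) sum((p - c(1, 2))^2)   # stand-in objective
p0 <- c(0, 0)
fit <- optim(p0, f, method = "L-BFGS-B",
             control = list(maxit = 500,   # more outer iterations
                            factr = 1e10,  # looser convergence tolerance
                            pgtol = 1e-5,  # projected-gradient tolerance
                            trace = 1))    # print progress
fit$convergence; fit$message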
2003 Nov 06
2
Search engine fails entirely (PR#4966)
Full_Name: Robert King Version: 1.8.0 OS: linux (debian) Submission from: (NULL) (134.148.20.33) Also seen in 1.7 Start R, start html help (mozilla), go to the search engine, try searching for "mean". No results given. Mozilla javascript console says: Error: document.SearchEngine.search is not a function Source File: file:///tmp/Rtmp22901/.R/doc/html/search/SearchEngine.html Line: 32 line 32 is
2000 Sep 26
2
bounds violations, infinite loops in optim/L-BFGS-B (PR#671)
I'm having some trouble with optim(method="L-BFGS-B"), and I'm not sure I have the ability to track down and fix what seem to be bugs within optim(). I'm bootstrapping an original data set and fitting a model to each bootstrapped data set. For some bootstrapped samples, optim() sets negative parameter values (despite the fact that I have explicitly set non-zero lower
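A common workaround when an optimizer strays past a positivity constraint is to optimize on the log scale, so the constraint cannot be violated at all. A minimal sketch, with a hypothetical stand-in objective f that expects positive parameters:

f <- function(p) (p[1] - 2)^2 + (p[2] - 0.5)^2   # stand-in objective, wants p > 0
f.log <- function(lp) f(exp(lp))                 # reparameterize: p = exp(lp) > 0
fit <- optim(log(c(1, 1)), f.log, method = "BFGS")
exp(fit$par)                                     # back-transform to the original scale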
2005 Aug 26
1
passing arguments from nnet to optim
Hi everyone, According to the R reference manual, the nnet function uses the BFGS method of optim to optimize the neural network parameters. When calling nnet, I would like to tell the optim function not to produce the tracing information on the progress of the optimization, or at least to reduce the frequency of the reports. I tried the following: a) nnet default > x<-rnorm(20)
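If memory serves, nnet() has its own trace argument (TRUE by default) that switches the per-iteration reporting on or off; a minimal sketch with made-up data:

library(nnet)
x <- matrix(rnorm(40), 20, 2)
y <- rnorm(20)
fit <- nnet(x, y, size = 2, linout = TRUE, trace = FALSE)  # trace = FALSE suppresses the fitting output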
2009 Feb 24
2
Tracing gradient during optimization
Hi everyone, I am currently using the function optim() to maximize/minimize functions and I would like to see more output of the optimization procedure, in particular the numerical gradient of the parameter vector during each iteration. The documentation of optim() says that the trace parameter should allow one to trace the progress of the optimization. I use the following command:
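optim()'s trace output does not include the gradient, but one can print it from a wrapper around the gradient function supplied to optim(); a minimal sketch with a stand-in objective and its analytic gradient:

f <- function(p) sum((p - c(1, -1))^2)          # stand-in objective
gr <- function(p) {
  g <- 2 * (p - c(1, -1))                       # analytic gradient of f
  cat("par:", format(p), " grad:", format(g), "\n")
  g
}
fit <- optim(c(0, 0), f, gr, method = "BFGS")   # gradient printed on every gr() call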
2010 Jan 12
3
optim: abnormal termination in lnsrch (resend)
[sorry, forgot some details...] I'm using optim(param, fun, method='L-BFGS-B', lower=lo, upper=up) to minimize a certain function. Often the minimization ends with the message: ERROR: ABNORMAL_TERMINATION_IN_LNSRCH What is optim() trying to say? What do I have to change in my function to make the minimization succeed? Do you think using BBoptim() instead of optim() changes anything?
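Whether BBoptim() (from the BB package) behaves differently depends on the problem; a guarded sketch with hypothetical stand-ins for fun, param, lo and up (names as in the post, bodies made up):

fun   <- function(p) sum((p - 0.3)^2)   # stand-in objective
param <- c(0, 0); lo <- c(-1, -1); up <- c(1, 1)
if (requireNamespace("BB", quietly = TRUE)) {
  fit <- BB::BBoptim(param, fun, lower = lo, upper = up)
  fit$par
}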
2003 Dec 18
1
NUMERIC DERIVATE
UseRs, I used the optim function valor.optim <- optim(c(1,1,1),logexp1,method ="BFGS",control=list(fnscale=-1),hessian=T); and I want to calculate the derivatives, psi1<-valor.optim$par[1] psi2<-valor.optim$par[2] psi3<-valor.optim$par[3] a0=exp(psi1); a1=exp(psi2)/(20+exp(psi2)+exp(psi3)); a2=exp(psi3)/(20+exp(psi2)+exp(psi3))
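One way to get numerical derivatives of the back-transformed quantities at the optimum is numDeriv::jacobian(); a sketch, where trans mirrors the a0/a1/a2 transformation above and psi.hat stands in for valor.optim$par:

trans <- function(psi) {
  c(a0 = exp(psi[1]),
    a1 = exp(psi[2]) / (20 + exp(psi[2]) + exp(psi[3])),
    a2 = exp(psi[3]) / (20 + exp(psi[2]) + exp(psi[3])))
}
if (requireNamespace("numDeriv", quietly = TRUE)) {
  psi.hat <- c(0.1, 0.2, 0.3)              # stand-in for valor.optim$par
  numDeriv::jacobian(trans, psi.hat)       # d(a0,a1,a2) / d(psi1,psi2,psi3)
}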
2003 Feb 01
1
Trouble with optim
I am having trouble with optim. It claims to have converged to a minimum, yet it has in the course of the optimization visited many points which are closer to optimal. I would be grateful for any explanation of this behaviour. I'm trying to estimate the parameters in the model X ~ Binomial(1,p) * NegBin(mu,theta). So I define a log likelihood function, and invoke optim thus: o <- optim
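One way to check this kind of behaviour is to record every point optim() evaluates and keep the best one, then compare it with the reported minimum; a minimal sketch with a stand-in objective:

f <- function(p) sum(p^2)                 # stand-in objective
best <- list(val = Inf, par = NULL)
f.rec <- function(p) {
  v <- f(p)
  if (is.finite(v) && v < best$val) best <<- list(val = v, par = p)
  v
}
fit <- optim(c(3, -2), f.rec)
c(reported = fit$value, best_seen = best$val)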
2008 May 30
1
Get all X iterations in optim output when controls(trace=6)
Hi, I would like to get all X iterations in optim output in matrix form. I know about the following approach: sink("reportOptim") optim( ......., control=list( trace=6,..........) ) sink() all_iterOptim <- readLines("reportOptim") unlink("reportOptim") all_iterOptim <- all_iterOptim[ grep( '^X', all_iterOptim ) ] ### TODO: the rest !!! :-) But it is very
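An alternative to parsing the sinked trace is to record the parameter vector at every function evaluation and bind the rows afterwards; a minimal sketch with a stand-in objective:

f <- function(p) sum((p - c(1, 2))^2)                            # stand-in objective
iters <- list()
f.log <- function(p) { iters[[length(iters) + 1]] <<- p; f(p) }  # record every evaluation
fit <- optim(c(0, 0), f.log, method = "BFGS")
X <- do.call(rbind, iters)                                       # one row per evaluation
head(X)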
2012 Aug 05
1
Possible bug with MCMCpack metropolis sampler
Hi, I'm having issues with what I believe is a bug in the MCMCpack's MCMCmetrop1R function. I have code that basically looks like this: posterior.sampler <- function(data, prior.mu){ log.posterior <- function(theta) log.likelihood(data, theta) + log.prior(prior.mu, theta) post.samples <- MCMCmetrop1R(log.posterior, theta.init=prior.mu, burnin=100, mcmc=1000, thin=40,
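A guarded, self-contained version of the pattern described in the post, with hypothetical stand-ins for log.likelihood and log.prior, and the extra objects passed through MCMCmetrop1R's '...' rather than captured in a closure (an assumption, not a confirmed fix for the reported problem):

log.likelihood <- function(data, theta) sum(dnorm(data, theta, log = TRUE))        # stand-in
log.prior      <- function(prior.mu, theta) dnorm(theta, prior.mu, 10, log = TRUE) # stand-in
log.posterior  <- function(theta, data, prior.mu)
  log.likelihood(data, theta) + log.prior(prior.mu, theta)
if (requireNamespace("MCMCpack", quietly = TRUE)) {
  post.samples <- MCMCpack::MCMCmetrop1R(log.posterior, theta.init = 0,
                                         burnin = 100, mcmc = 1000, thin = 40,
                                         data = rnorm(50, 2), prior.mu = 0)
}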
2001 Sep 25
3
Error in optim(p, fun,...)
All: I am getting an error code from the optimization function. The code is: Error in optim(p, fun.LLike, lower=low, upper=up, method = "L-BFGS-B", : non-finite finite-difference value [0] If I add a trace=6 option to my control list, the last message before this error is: At X0, 0 variables are exactly at the bounds. Any ideas on where I should start would be
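This error usually means the objective returned NA/NaN/Inf at a point probed by the finite-difference gradient; a common workaround is to return a large finite penalty instead. A minimal sketch, with a hypothetical stand-in for fun.LLike:

fun.LLike <- function(p) -sum(dnorm(p, log = TRUE))   # stand-in log-likelihood
fun.safe <- function(p) {
  v <- fun.LLike(p)
  if (!is.finite(v)) 1e10 else v                      # large finite penalty instead of NA/Inf
}
fit <- optim(c(0.5, 0.5), fun.safe, method = "L-BFGS-B",
             lower = c(0, 0), upper = c(1, 1), control = list(trace = 6))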
2008 Jul 21
1
Control parameter of the optim( ): parscale
Hi everybody, I am using the L-BFGS-B method of the mle2() function to estimate the values of 6 parameters. mle2 uses the methods implemented in optim. As I got it from the descriptions available online, one can use the parscale parameter to tell R somehow what the values of the estimated parameters should be . . . Could somebody please help me understand what one actually has to do with the
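Roughly speaking, parscale is a vector of typical magnitudes for the parameters: optim() works internally with par/parscale, so parameters that differ by orders of magnitude end up on a comparable footing. A minimal sketch with a made-up objective whose two parameters live on very different scales:

f <- function(p) (p[1] - 5000)^2 / 1e6 + (p[2] - 0.02)^2 * 1e4   # stand-in objective
fit <- optim(c(1000, 0.5), f, method = "L-BFGS-B",
             lower = c(0, 0), upper = c(1e5, 1),
             control = list(parscale = c(1000, 0.01)))           # rough parameter magnitudes
fit$par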
2018 Apr 17
1
Minor glitch in optim()
Having worked with optim() and related programs for years, it surprised me that I haven't noticed this before, but optim() is inconsistent in how it deals with bounds constraints specified at infinity. Here's an example: # optim-glitch-Ex.R x0<-c(1,2,3,4) fnt <- function(x, fscale=10){ yy <- length(x):1 val <- sum((yy*x)^2)*fscale } grt <- function(x, fscale=10){ nn
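If I read the post correctly, the inconsistency shows up when infinite bounds are spelled out explicitly: optim() may then warn and switch to L-BFGS-B even though the bounds never bind. A minimal sketch (fnt condensed from the post):

fnt <- function(x, fscale = 10) sum((length(x):1 * x)^2) * fscale
x0  <- c(1, 2, 3, 4)
o1 <- optim(x0, fnt)                        # default Nelder-Mead, no bounds given
o2 <- optim(x0, fnt, lower = rep(-Inf, 4))  # may warn and switch to L-BFGS-B
c(o1$value, o2$value)                       # compare the two runs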
2010 Sep 30
3
how to avoid NaN in optim()
Hi, lik <- function(nO, nA, nB, nAB){ loglik <- function(par) { p=par[1] q=par[2] r <- 1 - p - q if (c(p,q,r) > rep(0,3) && c(p,q,r) < rep(1,3) ) { -(2 * nO * log (r) + nA * log (p^2 + 2 * p * r) + nB * log (q^2 + 2 * q * r) + nAB * (log(2) +log(p) +log(q))) } else NA } loglik }
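The if() condition above compares whole vectors (effectively only the first element is checked), and returning NA to optim() is another common source of trouble; a corrected sketch uses all() and a large finite penalty instead (the counts in the optim() call are made up):

lik <- function(nO, nA, nB, nAB) {
  function(par) {
    p <- par[1]; q <- par[2]; r <- 1 - p - q
    if (all(c(p, q, r) > 0) && all(c(p, q, r) < 1)) {
      -(2 * nO * log(r) + nA * log(p^2 + 2 * p * r) +
          nB * log(q^2 + 2 * q * r) + nAB * (log(2) + log(p) + log(q)))
    } else 1e10                                  # large finite penalty instead of NA
  }
}
fit <- optim(c(0.3, 0.3), lik(nO = 100, nA = 50, nB = 30, nAB = 20))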
2005 Nov 11
1
optim not giving correct minima
Hello, I am trying to use optim() on a function involving a summation. My function basically is a thinned Poisson likelihood. I have two parameters and in most cases optim() does a fine job of getting the minima. I am simulating my data based on pre-specified parameters, so I know what I should be getting. However when my true parameters fall in a particular range, optim() gives
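When the answer depends on where the true parameters lie, a cheap sanity check is to restart optim() from several starting points and keep the best fit; a minimal sketch with a made-up objective that has more than one basin:

f <- function(p) (p[1]^2 - 4)^2 + (p[2] - 1)^2        # stand-in objective with two basins
starts <- list(c(-3, 0), c(0.1, 0.1), c(3, 2), c(5, -5))
fits  <- lapply(starts, function(s) optim(s, f, method = "BFGS"))
best  <- fits[[which.min(sapply(fits, `[[`, "value"))]]
best$par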
2008 Apr 15
1
disturbing seed dependence in optim L-BFGS-B method
The use of optim with the L-BFGS-B method for the following simple function gives erroneous results. Any help appreciated! Best, Bob Reilly # Code: V=function(p){ p1=p[1];p2=p[2] y=p1*p2-.4*(p1+p2) return(-y)} p=c(.2,.2) # p=c(.8,.8) max=optim(p,V,method = "L-BFGS-B",lower=c(0,0),upper=c(1,1)) max1=optim(max$par,V,method = "L-BFGS-B",lower=c(0,0),upper=c(1,1))
2003 Oct 20
3
'optim' and extra argument to the objective function
Hello, I'd like to use optim, and give extra arguments to the objective function. The man page says that the '...' should let one do it, but I have a hard time understanding how. Example: x <- 1:10 y <- rnorm(10) cost.f <- function(par, x, y) { A <- par[1] cost <- sum( (log(A*x) - log(y))^2) return(cost) } optim(3, cost.f, x, y) ## returns: Error in pmatch(x,
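The extra arguments need to be passed by name; given positionally, x and y get matched to optim()'s own gr and method arguments, which is what triggers the pmatch error. A sketch of the call that should work (y is generated as exp(rnorm(10)) here only so that log(y) stays finite):

x <- 1:10
y <- exp(rnorm(10))
cost.f <- function(par, x, y) {
  A <- par[1]
  sum((log(A * x) - log(y))^2)
}
optim(3, cost.f, x = x, y = y, method = "BFGS")   # named arguments go through '...'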
2015 Sep 17
1
names treatment in optim()
Dear both, I have found that names are not treated in the same way in optim() depending on the optimization method (argument method). The example below shows the difference between the Brent method and the L-BFGS-B method. f <- function(x){ y <- x^2;names(y) <-"f(x)";y} optim(10, f, method="Brent", lower=-1, upper=10)$value optim(10, f, method="L-BFGS-B",
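If the difference matters for downstream code, one workaround is simply to strip the names inside (or around) the objective; a minimal sketch:

f <- function(x){ y <- x^2; names(y) <- "f(x)"; y }
f.unnamed <- function(x) unname(f(x))               # drop names before optim() sees the value
optim(10, f.unnamed, method = "Brent",    lower = -1, upper = 10)$value
optim(10, f.unnamed, method = "L-BFGS-B", lower = -1, upper = 10)$value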
2008 Jul 29
1
optim fails when using arima
Hi all, I'm using the arima() function to study a time series but it gives me the following error: Error in optim(init[mask], armafn, method = "BFGS", hessian = TRUE, control = optim.control, : non-finite finite-difference value [3] I know that I can change the method of the arima() to "CSS" instead of "ML" but I'm especially interested in using
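One thing that sometimes helps is fitting by CSS first and feeding those coefficients to the ML fit as starting values through arima()'s init argument; a hedged sketch on a simulated series (the model order is made up):

set.seed(1)
x <- arima.sim(list(ar = 0.6, ma = 0.3), n = 200)          # stand-in series
fit.css <- arima(x, order = c(1, 0, 1), method = "CSS")     # cheap first pass
fit.ml  <- arima(x, order = c(1, 0, 1), method = "ML",
                 init = coef(fit.css))                      # start ML from the CSS estimates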
2008 Sep 29
0
Logistic Regression using optim() give "L-BFGS-B" error, please help
Sorry, I deleted my old post. Pasting the new query below. Dear R Users/Experts, I am using a function called logitreg() originally described in MASS (the book, 4th ed.) by Venables & Ripley, p. 445. I used the code as provided but made a couple of changes to run a 'constrained' logistic regression: I set method = "L-BFGS-B" and set lower/upper values for the variables. Here
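A self-contained sketch of the same idea, loosely following the logitreg() approach from MASS: box-constrained logistic regression via optim() with method = "L-BFGS-B" (the data and bounds below are made up):

negLL <- function(beta, X, y) {
  eta <- drop(X %*% beta)
  sum(log1p(exp(eta))) - sum(y * eta)        # negative Bernoulli log-likelihood
}
set.seed(1)
X <- cbind(1, matrix(rnorm(200), 100, 2))
y <- rbinom(100, 1, plogis(X %*% c(-0.5, 1, 0.3)))
fit <- optim(rep(0, 3), negLL, X = X, y = y, method = "L-BFGS-B",
             lower = c(-Inf, 0, 0), upper = c(Inf, 2, 2))   # made-up box constraints
fit$par; fit$convergence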