similar to: Define lower-upper bound for parameters in Optim using Nelder-Mead method

Displaying 20 results from an estimated 6000 matches similar to: "Define lower-upper bound for parameters in Optim using Nelder-Mead method"

2010 Mar 05
2
Improved Nelder-Mead algorithm - a potential replacement for optim's Nelder-Mead
Hi, I have written an R translation of C.T. Kelley's Matlab version of the Nelder-Mead algorithm. This algorithm is discussed in detail in his book "Iterative methods for optimization" (SIAM 1999, Chapter 8). I have tested this relatively extensively on a number of smooth and non-smooth problems. It performs well, in general, and it almost always outperforms optim's
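If a packaged implementation of this kind is wanted, the dfoptim package provides nmk() (and a box-constrained nmkb()), which is also built on Kelley's variant; that attribution is an assumption here, and the sketch below only shows the calling convention on a non-smooth test function.

## Minimal sketch: optim's Nelder-Mead versus dfoptim::nmk on a non-smooth
## objective (sum of absolute deviations).  Assumes dfoptim is installed.
library(dfoptim)

fn <- function(p) sum(abs(p - c(1, -2, 3)))   # non-smooth test function
p0 <- c(0, 0, 0)

res_optim <- optim(p0, fn, method = "Nelder-Mead")
res_nmk   <- nmk(p0, fn)

res_optim$value   # objective value reached by optim's Nelder-Mead
res_nmk$value     # objective value reached by nmk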
2010 Feb 08
2
evolution of Nelder-Mead process
Dear list,   I am looking for an R-only implementation of a Nelder-Mead process that can find local maxima of a spatially distributed variable, e.g. height, on a spatial grid, and outputs the coordinates of the new point during each evaluation. I have found two previous threads about this topic, and was wondering if something similar has been implemented since those messages were posted.   Thank
2009 Oct 10
2
Nelder-Mead with output of simplex vertices
Greetings! I want to follow the evolution of a Nelder-Mead function minimisation (a function of 2 variables). Hence each simplex will have 3 vertices. Therefore I would like to have a function which can output the coordinates of the 3 vertices after each new simplex is generated. However, there seems to be no way (which I can detect) of extracting this information from optim() (the
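optim() does not expose the simplex vertices, but the sequence of points at which it evaluates the objective can be recorded by wrapping the function; a minimal sketch of that workaround (the quadratic objective is only illustrative):

## Record every parameter vector at which optim evaluates the objective.
## This does not recover the simplex itself, but the trace of evaluation
## points shows how the search moves across the surface.
trace_env <- new.env()
trace_env$pts <- list()

f <- function(p) (p[1] - 1)^2 + (p[2] + 2)^2   # illustrative 2-variable function

f_traced <- function(p) {
  trace_env$pts[[length(trace_env$pts) + 1]] <- p
  f(p)
}

res <- optim(c(0, 0), f_traced, method = "Nelder-Mead")
pts <- do.call(rbind, trace_env$pts)           # one row per evaluation
head(pts)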
2005 Mar 18
1
Constrained Nelder-Mead
All, In looking at `optim', it doesn't appear that it is possible to impose nonlinear constraints on Nelder-Mead. I am sufficiently motivated to try to code something in C from scratch and try to call it from R.... Does anyone have some good references to barrier and/or penalization methods for Nelder-Mead? I would ideally like some papers with pseudocode for method(s) that are in
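One standard workaround, since optim's Nelder-Mead takes no constraints at all, is a penalty method: add a term to the objective that grows with the amount of constraint violation. A minimal sketch with an illustrative nonlinear constraint x1^2 + x2^2 <= 1:

## Penalised objective for Nelder-Mead: original function plus a quadratic
## penalty on the violation of g(p) <= 0.
f <- function(p) (p[1] - 2)^2 + (p[2] - 2)^2    # unconstrained objective
g <- function(p) p[1]^2 + p[2]^2 - 1            # constraint g(p) <= 0

penalised <- function(p, weight = 1e4) f(p) + weight * max(0, g(p))^2

res <- optim(c(0, 0), penalised, method = "Nelder-Mead")
res$par    # close to the constrained optimum (1/sqrt(2), 1/sqrt(2))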
2012 Aug 18
1
Parameter scaling problems with optim and Nelder-Mead method (bug?)
Dear all, I'm having some problems getting optim with method="Nelder-Mead" to work properly. It seems like there is no way of controlling the step size, and the step size seems to depend on the *difference* between the initial values, which makes no sense. Example: f=function(xy, mu1, mu2) { print(xy) dnorm(xy[1]-mu1)*dnorm(xy[2]-mu2) } f1=function(xy) -f(xy, 0,
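One control that does help here is control$parscale: optim works on par/parscale, so the initial Nelder-Mead simplex is built in that scaled space, and telling optim the typical magnitude of each parameter usually fixes a badly scaled start. A minimal sketch (objective and scale values are illustrative):

## parscale tells optim the typical magnitude of each parameter; the search
## (and the initial simplex) then works on par/parscale.
f <- function(p) (p[1] - 2)^2 + (p[2] - 5000)^2

res <- optim(c(0, 0), f, method = "Nelder-Mead",
             control = list(parscale = c(1, 1000), maxit = 2000))
res$par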
2005 Mar 23
4
non-derivative based optimization and standard errors.
Hi All, I have this problem that my objective function is discontinuous in the parameters and I need to use methods such as Nelder-Mead to get around this. My question is: how do I compute standard errors for a problem that does not have a gradient? Any literature on this is greatly appreciated. Jean,
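For a (negative) log-likelihood, one common answer is to ask optim for a numerically differentiated Hessian at the optimum and invert it; this is only meaningful if the surface is smooth enough near the solution, which has to be judged case by case. A minimal sketch with an illustrative normal-sample likelihood:

## Approximate standard errors from the numerical Hessian returned by optim.
set.seed(1)
x <- rnorm(200, mean = 3, sd = 2)

## Negative log-likelihood; parameters are the mean and log(sd).
negll <- function(p) -sum(dnorm(x, mean = p[1], sd = exp(p[2]), log = TRUE))

fit <- optim(c(0, 0), negll, method = "Nelder-Mead", hessian = TRUE)
se  <- sqrt(diag(solve(fit$hessian)))
cbind(estimate = fit$par, se = se)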
2005 Nov 15
1
An optim() mystery.
I have a Master's student working on a project which involves estimating parameters of a certain model via maximum likelihood, with the maximization being done via optim(). A phenomenon has occurred which I am at a loss to explain. If we use certain pairs of starting values for optim(), it simply returns those values as the ``optimal'' values, although they are definitely not
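A first diagnostic in that situation is to evaluate the objective directly at the offending starting values and then look at optim's $convergence and $counts; a non-finite or locally constant function value at the start is the usual culprit. A minimal sketch (the function here is only a stand-in for the real likelihood):

## Stand-in negative log-likelihood; replace with the model's own function.
negll <- function(p) (p[1] - 1)^2 + (p[2] - 2)^2

p0 <- c(5, 5)
negll(p0)                  # must be a finite scalar; NA/Inf stalls Nelder-Mead

fit <- optim(p0, negll, method = "Nelder-Mead")
fit$convergence            # 0 means the convergence test was satisfied
fit$counts                 # a handful of evaluations is a warning sign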
2007 Jan 03
1
optim
Hi! I'm trying to figure out how to use optim... I get some really strange results, so I guess I got something wrong. I defined the following function which should be minimized: errorFunction <- function(localShifts,globalShift,fileName,experimentalPI,lambda) { lambda <- 1/sqrt(147) # error <- abs(errHuber(localShifts,globalShift, #
2017 Dec 31
1
Order of methods for optimx
Dear R-ers, For a non-linear optimisation, I used optim() with the BFGS method, but it regularly stopped before reaching a true minimum. It was not a problem with the iteration limit, just a local minimum. I was sometimes able to reach a better minimum using several rounds of optim(). Then I moved to optimx() to do the different optim rounds automatically using "Nelder-Mead" and
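optimx() runs a vector of methods in the order given, and with control follow.on = TRUE each method starts from the previous one's answer, which automates exactly this Nelder-Mead-then-BFGS polishing. A minimal sketch assuming the optimx package is installed:

## Nelder-Mead first, then BFGS restarted from the Nelder-Mead solution.
library(optimx)

f <- function(p) 100 * (p[2] - p[1]^2)^2 + (1 - p[1])^2   # Rosenbrock test function

res <- optimx(c(-1.2, 1), f,
              method  = c("Nelder-Mead", "BFGS"),
              control = list(follow.on = TRUE))
res   # one row per method; the last row is the polished solution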
2020 Oct 28
2
R optim() function
Hi R-Help, I am using R to do functional outlier detection (using PCA to reduce to 2 dimensions - the functional boxplot methodology used in the Rainbow package), and using the Hscv.diag function to calculate the bandwidth matrix, where this line of code is run: result <- optim(diag(Hstart), scv.mat.temp, method = "Nelder-Mead", control = list(trace = as.numeric(verbose))) Within the
2012 Oct 23
1
help using optim function
Hi, I am very new to R and I've written an optim function, but can't get it to work: least.squares.fitter <- function(start.params, gr, low.constraints, high.constraints, model.one.stepper, data, scale, ploton=F) { result <- optim(par=start.params, method=c('Nelder-Mead'), fn=least.squares.fit, lower=low.constraints, upper=high.constraints, data=data, scale=scale, ploton=ploton)
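The lower/upper arguments in that call are not used by method "Nelder-Mead"; in optim, box constraints are only honoured by "L-BFGS-B" (and "Brent"). A minimal sketch of a bounded call, with an illustrative objective and bounds, which is also the direct answer to the "lower-upper bound" question in the title of this search:

## Box-constrained minimisation: L-BFGS-B is the optim method that actually
## uses lower and upper.  Objective and bounds are illustrative.
f <- function(p) sum((p - c(2, 7))^2)

res <- optim(par    = c(1, 1),
             fn     = f,
             method = "L-BFGS-B",
             lower  = c(0, 0),
             upper  = c(5, 5))
res$par   # the second parameter is pinned at its upper bound of 5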
2006 Jun 12
1
r's optim vs. matlab's fminsearch
Hi, I'm having a problem converting a Matlab program into R. The R code works almost all the time, but about 4% of the time R's optim function gets stuck on a local minimum whereas matlab's fminsearch function does not (or at least fminsearch finds a better minimum than optim). My understanding is that both functions default to Nelder-Mead optimization, but what's different about
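Beyond any difference between the two implementations, a cheap way to close the gap on stubborn cases is to restart optim's Nelder-Mead from its own answer, so that a fresh simplex is built at that point; a minimal sketch (the tolerances and the Rosenbrock test function are illustrative):

## Restart Nelder-Mead a few times from its previous solution.
f <- function(p) 100 * (p[2] - p[1]^2)^2 + (1 - p[1])^2

res <- optim(c(-1.2, 1), f, method = "Nelder-Mead",
             control = list(maxit = 1000, reltol = 1e-10))
for (i in 1:3) {
  res <- optim(res$par, f, method = "Nelder-Mead",
               control = list(maxit = 1000, reltol = 1e-10))
}
res$par    # should now be very close to c(1, 1)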
2007 Jun 22
2
fitCopula
I am using R 2.5.0 on windows XP and trying to fit copula. I see the following code works for some users, however my code crashes on the chol. Any suggestions? > mycop <- tCopula(param=0.5, dim=8, dispstr="ex", df=5) > x <- rcopula(mycop, 1000) > myfit <- fitCopula(x, mycop, c(0.6, 10), optim.control=list(trace=1), method="Nelder-Mead")
2012 Nov 03
2
optim & .C / Crashing on run
Hello, I am attempting to use optim under the default Nelder-Mead algorithm for model fitting, minimizing a Chi^2 statistic whose value is determined by a .C call to an external shared library compiled from C & C++ code. My problem has been that the R session will immediately crash upon starting the simplex run, without it taking a single step. This is strange, as the .C call itself works,
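A useful first check in that situation is to call the R wrapper around the .C routine directly at the starting values and at a few perturbed points before handing it to optim, since building the initial simplex is the very first thing Nelder-Mead does; a minimal sketch in which chi2_fn is only a placeholder for the real wrapper:

## Sanity-check the objective before optim ever calls it.
chi2_fn <- function(p) sum((p - c(1, 2, 3))^2)   # placeholder for the .C wrapper

p0 <- c(0, 0, 0)
chi2_fn(p0)                                      # must return a finite scalar

## Evaluate at small perturbations of p0, roughly mimicking the first simplex.
for (i in seq_along(p0)) {
  p <- p0
  p[i] <- p[i] + 0.1 * max(abs(p0[i]), 0.1)
  cat("vertex", i, "->", chi2_fn(p), "\n")
}

res <- optim(p0, chi2_fn, method = "Nelder-Mead")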
2009 Nov 30
3
Question about output from optim
Dear R-users, I am trying to port to R something that I wrote in Matlab to perform model parameter optimization using the Nelder-Mead simplex method (fminsearch). I read the help on ?optim (which seems to be the way to go) as well as a bunch of posts on the topic, but I would like to make sure about something before I spend too much time trying to reproduce something that is not possible. The
2008 Mar 11
1
messages from mle function
Dear useRs, I am using the mle function, but it gives me the following errors that I don't understand. Perhaps someone can help me. Thank you for your attention. Bernardo. > erizo <- read.csv("Datos_Stokes_1.csv", header = TRUE) > head(erizo) EDAD TALLA 1 0 7.7 2 1 14.5 3 1 16.9 4 1 13.2 5 1 24.4 6 1 22.5 > TAN <-
2009 Dec 10
1
obtain intermediate estimate using optim
Hi, Currently I am trying to solve a minimization problem using optim with method Nelder-Mead. However, Nelder-Mead needs many iterations until it finally converges. I have set $control.trace and $control.report such that I can see the value of the function at each iteration. I do see that I set the convergence criteria too strict, in the sense that the function value does not change much. However,
2012 Apr 24
1
Use of optim to fit two curves at the same time ?
Dear list, Here is a small example that uses optim and optimize in order to fit two functions. Is it possible to fit two functions (like these two, for example) at the same time using optim ... or another function in R? Thanks Arnaud ###################################################################### ## function 1 x1 <- 1:100 y1 <- 5.468 * x1 + 3 # + rnorm(100,0, 10) dfxy <-
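The usual approach is to have the objective return one combined value, e.g. the pooled sum of squared residuals of both curves, so that a single optim call fits the shared parameters; a minimal sketch with two illustrative straight lines sharing a slope:

## Fit two curves at once: one objective returning the pooled residual sum of
## squares.  Data and model (shared slope, two intercepts) are illustrative.
set.seed(42)
x1 <- 1:100; y1 <- 5.468 * x1 + 3  + rnorm(100, 0, 10)
x2 <- 1:100; y2 <- 5.468 * x2 + 40 + rnorm(100, 0, 10)

## p = (shared slope, intercept of curve 1, intercept of curve 2)
rss_both <- function(p) {
  sum((y1 - (p[1] * x1 + p[2]))^2) + sum((y2 - (p[1] * x2 + p[3]))^2)
}

res <- optim(c(1, 0, 0), rss_both, method = "Nelder-Mead")
res$par   # estimated slope and the two intercepts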
2008 Jul 29
1
optim fails when using arima
Hi all, I'm using the arima() function to study a time series but it gives me the following error: Error in optim(init[mask], armafn, method = "BFGS", hessian = TRUE, control = optim.control, : non-finite finite-difference value [3] I know that I can change the method of arima() to "CSS" instead of "ML", but I'm especially interested in using
2011 Nov 29
2
Parameters setting in functions optimization
Good afternoon everybody, I'm quite new to function optimization in R and, although I've read lots of function descriptions, I'm not sure of the correct settings for functions like "optimx" and "nlminb". I'd like to find the parameter values that minimize the log-likelihood result of the function. My parameters are a mean distance of dispersion and a proportion of
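nlminb() takes the objective (typically the negative log-likelihood) together with optional lower and upper vectors, which is a direct way to keep a dispersion distance positive and a proportion inside [0, 1]; a minimal sketch in which the objective is only a stand-in for the real likelihood:

## Bounded minimisation with nlminb: a positive mean dispersion distance and a
## proportion constrained to [0, 1].  negll is a placeholder objective.
negll <- function(p) (p[1] - 2.5)^2 + (p[2] - 0.3)^2

fit <- nlminb(start     = c(1, 0.5),      # (distance, proportion)
              objective = negll,
              lower     = c(1e-6, 0),
              upper     = c(Inf,  1))
fit$par
fit$convergence   # 0 indicates successful convergence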