search for: mead

Displaying 20 results from an estimated 331 matches for "mead".

2010 Mar 05
2
Improved Nelder-Mead algorithm - a potential replacement for optim's Nelder-Mead
Hi, I have written an R translation of C.T. Kelley's Matlab version of the Nelder-Mead algorithm. This algorithm is discussed in detail in his book "Iterative methods for optimization" (SIAM 1999, Chapter 8). I have tested this relatively extensively on a number of smooth and non-smooth problems. It performs well, in general, and it almost always outperforms optim's...
2010 Feb 08
2
evolution of Nelder-Mead process
Dear list,   I am looking for an R-only implementation of a Nelder-Mead process that can find local maxima of a spatially distributed variable, e.g. height, on a spatial grid, and outputs the coordinates of the new point during each evaluation. I have found two previous threads about this topic, and was wondering if something similar has been implemented since those me...
2010 May 15
3
multi-homed samba PDC and NetApp filers
...CIFS connections from each filer are registered by the samba server and are logged in the file: 0.0.0.0.log Each of the filers moved to a new network. Filer #1 rejoined the domain but filer #2 can't. A tcpdump of the unsuccessful transaction is: 10:42:38.137963 IP gcc-fs1.netbios-ns > mead.netbios-ns: NBT UDP PACKET(137): MULTIHOMED REGISTRATION; REQUEST; UNICAST 10:42:38.138165 IP mead.netbios-ns > gcc-fs1.netbios-ns: NBT UDP PACKET(137): WACK; POSITIVE; RESPONSE; UNICAST 10:42:58.270693 IP mead.netbios-ns > gcc-fs1.netbios-ns: NBT UDP PACKET(137): REGISTRATION; NEGATIVE; RESP...
2002 Oct 28
1
Nelder-Mead and nlm
Hello, I have been using R to fit my data using non-linear least squares. I have used the optimize routine to minimize the sum of squared errors (using the Nelder-Mead optimization routine), but couldn't get the non-linear model in R to converge to the estimates achieved in a convergent Nelder-Mead routine. It tells me about problems with the gradient. I was wondering if there is any way to estimate the covariance matrix for the parameters when using the...
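A minimal sketch of the usual answer to the covariance question above (not from the thread itself; the data and model here are invented for illustration): ask optim() for the Hessian of the negative log-likelihood at the optimum and invert it.

```r
# Sketch: optim(..., hessian = TRUE) returns the numerically estimated
# Hessian at the solution; for a negative log-likelihood its inverse
# approximates the covariance matrix of the parameter estimates.
set.seed(1)
x <- rnorm(200, mean = 2, sd = 1.5)                 # toy data
nll <- function(p) -sum(dnorm(x, mean = p[1], sd = exp(p[2]), log = TRUE))
fit <- optim(c(0, 0), nll, method = "Nelder-Mead", hessian = TRUE)
vcov <- solve(fit$hessian)   # approximate covariance of (mean, log sd)
sqrt(diag(vcov))             # approximate standard errors
```

For a least-squares fit rather than a likelihood, the inverse Hessian has to be rescaled by an estimate of the error variance.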
2012 May 01
2
Define lower-upper bound for parameters in Optim using Nelder-Mead method
Dear UseRs, Is there a way to define lower and upper bounds for parameters fitted by optim using the Nelder-Mead method? Thanks, Arnaud
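Two common workarounds, sketched below (a suggestion, not from the thread; the objective and bounds are invented): either switch to a method that supports box constraints, or keep Nelder-Mead and map an unconstrained parameter into the box.

```r
f <- function(p) (p[1] - 2)^2 + (p[2] - 3)^2  # toy objective, optimum at (2, 3)

# 1) Use a method that supports box constraints directly:
fit1 <- optim(c(0, 0), f, method = "L-BFGS-B",
              lower = c(-1, -1), upper = c(1.5, 10))

# 2) Keep Nelder-Mead but optimize an unconstrained parameter q and
#    map it into the box with a logistic transform:
to_box <- function(q, lo, hi) lo + (hi - lo) / (1 + exp(-q))
g <- function(q) f(to_box(q, c(-1, -1), c(1.5, 10)))
fit2 <- optim(c(0, 0), g, method = "Nelder-Mead")

fit1$par                                  # first coordinate pinned at 1.5
to_box(fit2$par, c(-1, -1), c(1.5, 10))   # approaches the same point
```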
2009 Oct 10
2
Nelder-Mead with output of simplex vertices
Greetings! I want to follow the evolution of a Nelder-Mead function minimisation (a function of 2 variables). Hence each simplex will have 3 vertices. Therefore I would like to have a function which can output the coordinates of the 3 vertices after each new simplex is generated. However, there seems to be no way (which I can detect) of extracting this in...
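One workaround, sketched below (an assumption, not base-R functionality: optim() does not expose the simplex): every vertex the algorithm uses is eventually passed to fn, so logging the evaluation points recovers them.

```r
# Record every point at which optim() evaluates the objective; the
# simplex vertices are among these evaluation points.
trace_env <- new.env()
trace_env$pts <- list()

f <- function(p) {
  trace_env$pts[[length(trace_env$pts) + 1]] <- p  # log each evaluation
  (p[1] - 1)^2 + (p[2] + 2)^2
}

fit <- optim(c(0, 0), f, method = "Nelder-Mead")
pts <- do.call(rbind, trace_env$pts)  # one row per function evaluation
head(pts, 4)  # initial vertex plus the first trial points
```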
2017 Dec 31
1
Order of methods for optimx
...but it stopped regularly before reaching a true minimum. It was not a problem with the iteration limit, just a local minimum. I was sometimes able to reach a better minimum using several rounds of optim(). Then I moved to optimx() to do the different optim rounds automatically using the "Nelder-Mead" and "BFGS" methods. I find a huge time difference using system.time() based on the order of these two methods: > snb # "Nelder-Mead" then "BFGS" user system elapsed 1021.656 0.200 1021.695 > sbn # "BFGS" then "Nelder-Mead"...
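The multi-round idea above can be sketched in base R alone (a hand-rolled version of what the poster automates with optimx; the objective is invented): each method restarts from the previous method's solution, so the order changes both the search path and the total time.

```r
# Chain two optim() rounds: a derivative-free Nelder-Mead pass,
# then a BFGS polish starting from its result.
f <- function(p) sum((p - 1:3)^2) + 0.1 * sum(sin(5 * p)^2)  # mildly multimodal
p0 <- rep(0, 3)
r1 <- optim(p0, f, method = "Nelder-Mead")
r2 <- optim(r1$par, f, method = "BFGS")   # gradient method refines the result
c(first = r1$value, polished = r2$value)  # polished value is never worse
```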
2005 Mar 18
1
Constrained Nelder-Mead
All, In looking at `optim', it doesn't appear that it is possible to impose nonlinear constraints on Nelder-Mead. I am sufficiently motivated to try to code something in C from scratch and try to call it from R.... Does anyone have some good references to barrier and/or penalization methods for Nelder-Mead? I would ideally like some papers with pseudocode for method(s) that are in some sense optimal for c...
2012 Aug 18
1
Parameter scaling problems with optim and Nelder-Mead method (bug?)
Dear all, I'm having some problems getting optim with method="Nelder-Mead" to work properly. It seems like there is no way of controlling the step size, and the step size seems to depend on the *difference* between the initial values, which makes no sense. Example: f=function(xy, mu1, mu2) { print(xy) dnorm(xy[1]-mu1)*dnorm(xy[2]-mu2) } f1=f...
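A commonly suggested workaround (a sketch under the assumption that the scaling of the initial simplex is the real issue; the objective below is invented): rescale the parameters with control$parscale so the default step is sensible on every coordinate.

```r
# Coordinates on wildly different scales: optimum at (1000, 0.001).
f <- function(p) (p[1] / 1000 - 1)^2 + (p[2] * 1000 - 1)^2

# optim() internally works on par/parscale, so giving each coordinate
# its natural scale makes the simplex steps comparable.
good <- optim(c(500, 0), f, method = "Nelder-Mead",
              control = list(parscale = c(1000, 0.001)))
good$par  # close to c(1000, 0.001)
```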
2007 Jun 22
2
fitCopula
...code works for some users, however my code crashes on the chol. Any suggestions? > mycop <- tCopula(param=0.5, dim=8, dispstr="ex", df=5) > x <- rcopula(mycop, 1000) > myfit <- fitCopula(x, mycop, c(0.6, 10), optim.control=list(trace=1), method="Nelder-Mead") Nelder-Mead direct search function minimizer function value for initial parameters = -1747.582044 Scaled convergence tolerance is 2.6041e-05 Stepsize computed as 1.000000 Error in chol(x, pivot = FALSE) : the leading minor of order 2 is not positive definite Kevin D. Oden e: k...
2005 Nov 15
1
An optim() mystery.
...nable, optimum, and seems to do so consistently --- i.e. it gets essentially the same estimates irrespective of starting values. We have plotted the log likelihood surface and it appears smooth and relatively innocuous. The phenomenon only occurs with "L-BFGS-B"; the default (Nelder-Mead simplex) method, with a heavy penalty for violating constraints, seems to work just fine. So we can get solutions; it just makes me uneasy that there's this funny behaviour going on. Can anyone shed any light on what the problem is? I have enclosed below code to reproduce the phenomenon. It is probab...
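The penalty approach mentioned above can be sketched as follows (a toy illustration, not the poster's actual likelihood): add a large penalty whenever a constraint is violated, so the unconstrained Nelder-Mead search is pushed back into the feasible region.

```r
# Minimize distance to (2, 2) subject to p1 + p2 <= 3; the constrained
# optimum lies on the boundary near c(1.5, 1.5).
f   <- function(p) (p[1] - 2)^2 + (p[2] - 2)^2
pen <- function(p) if (p[1] + p[2] > 3) 1e6 * (p[1] + p[2] - 3)^2 else 0

fit <- optim(c(1, 1), function(p) f(p) + pen(p), method = "Nelder-Mead")
fit$par  # near the boundary p1 + p2 = 3
```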
2010 Nov 21
1
solve nonlinear equation using BBsolve
...4) BBsolve(par = p0, fn = mgf_gammasum) dfsane(par = p0, fn = mgf_gammasum, control = list(trace = FALSE)) sane(par = p0, fn = mgf_gammasum, control = list(trace = FALSE)) and got the error message: > BBsolve(par = p0, fn = mgf_gammasum) Error in optim(par = par, fn = U, method = "Nelder-Mead", control = list(maxit = 100),  :   function cannot be evaluated at initial parameters Error in optim(par = par, fn = U, method = "Nelder-Mead", control = list(maxit = 100),  :   function cannot be evaluated at initial parameters Error in optim(par = par, fn = U, method = "...
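That error usually means the function returns NA/NaN/Inf at the starting values. A minimal sketch of the usual diagnosis (the system below is invented, not the poster's mgf_gammasum): evaluate fn(p0) by hand before calling the solver.

```r
# Toy system: log() makes the first component undefined for p[1] <= 0.
fn <- function(p) c(log(p[1]) - 1, p[1] * p[2] - 2)

p0 <- c(-1, 1)
suppressWarnings(all(is.finite(fn(p0))))  # FALSE: fix p0 before calling BBsolve

p1 <- c(1, 1)
all(is.finite(fn(p1)))                    # TRUE: a usable starting point
```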
2007 Jan 03
1
optim
...R/R1593_filtered.data",3.48)) + # sum(abs(localShifts))*lambda error <- sum(abs(localShifts))*lambda error # return the error to be minimized } Then I call optim: par <- seq(length=9, from=0, by=0) lambda <- 1/sqrt(147) optim(par, errorFunction, gr=NULL, method="Nelder-Mead", hessian=FALSE, globalShift, "/home/sarah/Semesterarbeit/Sequences/R/R1593_filtered.data", experimentalPI=3.48, lambda = lambda) The output is: $par [1] 0.56350964 0.56350964 0.56350964 0.56350964 0.00000000 -0.29515957 [7] 0.00569937 0.32543297 0.18...
2008 Mar 11
1
messages from mle function
...<- sum((LT - TAN(edad,f,c,a,d))^2) / N + logl <- (N/2)*log(sigma) + (sum((LT - TAN(edad,f,c,a,d))^2) / (2*sigma)) + } > ini.pars <- list(f=5.91e-05,c=-0.41732,a=0.009661,d=846.7179) > library(stats4) > erizo.mle <- mle(start= ini.pars, minuslogl = loglike, method="Nelder-Mead", control = list(maxit=1500, trace=TRUE)) Nelder-Mead direct search function minimizer function value for initial parameters = 1159.477620 Scaled convergence tolerance is 1.72776e-05 Stepsize computed as 84.671790 BUILD 5 3165.307359 1159.477620 . . . HI-REDUCTION 303 1158...
2007 Nov 04
3
Returning the mock associated with an expectation.
I was reading through the FlexMock docs and noticed the expectation method .mock, which returns the original mock associated with an expectation. It looks really handy for writing nice all-in-one mocks like: mock_user = mock('User').expects(:first_name).returns('Jonah').mock So I started playing around with mocha and found I could actually already do this!
2020 Oct 28
2
R optim() function
...do functional outlier detection (using PCA to reduce to 2 dimensions - the functional boxplot methodology used in the Rainbow package), and using Hscv.diag function to calculate the bandwidth matrix where this line of code is run: result <- optim(diag(Hstart), scv.mat.temp, method = "Nelder-Mead", control = list(trace = as.numeric(verbose))) Within the optim function, there is a call to an external C function: .External2(C_optim, par, fn1, gr1, method, con, lower, upper) Where Par = (0.339, 0.339), fn1 = function (diagH) { H <- diag(diagH) %*% diag(diagH) if (default.bflag(d =...
2009 Dec 10
1
obtain intermediate estimate using optim
Hi, Currently I am trying to solve a minimization problem using optim with method Nelder-Mead. However, Nelder-Mead needs many iterations until it finally converges. I have set $control.trace and $control.report such that I can see the value of the function at each iteration. I do see that I set the convergence criteria too strict in the sense that the function value does not change much. Ho...
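A sketch of the standard workaround (not from the thread; optim() itself only reports the final estimate): wrap the objective so it records the parameter vector at every evaluation.

```r
# Store every parameter vector optim() tries; the trace shows how the
# estimate evolves and whether a looser tolerance would suffice.
history <- list()
f <- function(p) {
  history[[length(history) + 1]] <<- p   # record intermediate point
  (p[1] - 3)^2 + (p[2] + 1)^2
}

fit <- optim(c(0, 0), f, method = "Nelder-Mead",
             control = list(reltol = 1e-4))  # looser tolerance stops earlier
length(history)   # number of evaluations performed
tail(history, 1)  # last point tried, close to fit$par
```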
2011 Apr 18
3
how to extract options for a function call
Hi, I'm having some difficulties formulating this question. But what I want, is to extract the options associated with a parameter for a function. e.g. method = c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN") in the optim function. So I would like to have a vector with c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN") Or for instance the 'method' in the dist func...
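For the question above there is a direct answer in base R (shown on the optim and dist examples the poster mentions): the default choices live in the function's formal arguments, so formals() plus eval() extracts them.

```r
# The 'method' default of optim() is a call to c(...); evaluating the
# unevaluated default yields the character vector of choices.
methods <- eval(formals(optim)$method)
methods  # c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN")

# The same idiom works for dist():
dist_methods <- eval(formals(dist)$method)
```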
2023 Aug 13
4
Noisy objective functions
...isy and smoothed noise functions we get for instance the following results: (Starting point is always `rep(0.1, 5)`, maximal number of iterations 5000, relative tolerance 1e-12, and the optimization is successful if the function value at the minimum is below 1e-06.)

 k    nmk   anms  neldermead  ucminf  optim_BFGS
 1    0.21  0.32  0.13        0.00    0.00
 3    0.52  0.63  0.50        0.00    0.00
 10   0.81  0.91  0.87        0.00    0.00

Solvers: nmk = dfoptim::nmk, anms = pracma::a...
2012 Oct 23
1
help using optim function
Hi, I am very new to R and I've written an optim wrapper, but can't get it to work: least.squares.fitter<-function(start.params,gr,low.constraints,high.constraints,model.one.stepper,data,scale,ploton=F) { result<-optim(par=start.params,method=c('Nelder-Mead'),fn=least.squares.fit,lower=low.constraints,upper=high.constraints,data=data,scale=scale,ploton=ploton) return(result) } least.squares.fitter(c(2,2),c(0,0),c(Inf,Inf),ricker.one.step...
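One likely issue with the call above, sketched on a toy objective (an observation about optim()'s documented behaviour, not a diagnosis of the poster's full code): supplying lower/upper together with method "Nelder-Mead" makes optim() warn and, in current R, switch to "L-BFGS-B".

```r
# Bounds are ignored by Nelder-Mead; optim() warns and falls back to
# L-BFGS-B, which does honour them.
f <- function(p) sum((p - 2)^2)  # unconstrained optimum at (2, 2)
fit <- suppressWarnings(
  optim(c(0, 0), f, method = "Nelder-Mead", lower = c(0, 0), upper = c(1, 1)))
fit$par  # clamped at the upper bound c(1, 1)
```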