Displaying 20 results from an estimated 332 matches for "meads".
2010 Mar 05
2
Improved Nelder-Mead algorithm - a potential replacement for optim's Nelder-Mead
Hi,
I have written an R translation of C.T. Kelley's Matlab version of the Nelder-Mead algorithm. This algorithm is discussed in detail in his book "Iterative methods for optimization" (SIAM 1999, Chapter 8). I have tested this relatively extensively on a number of smooth and non-smooth problems. It performs well, in general, and it almost always outperforms optim's
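A minimal way to try this kind of comparison yourself, assuming (the excerpt does not say so) that the posted translation corresponds to the nmk() routine later distributed in the dfoptim package, is to run both optimisers on a non-smooth test problem:
# Sketch only: dfoptim::nmk() is assumed to be the Kelley-based routine in question
library(dfoptim)
fnonsmooth <- function(x) sum(abs(x - c(1, -2, 3)))   # non-smooth, minimum value 0
p0 <- c(0, 0, 0)
res_optim <- optim(p0, fnonsmooth, method = "Nelder-Mead")
res_nmk   <- nmk(p0, fnonsmooth)
c(optim = res_optim$value, nmk = res_nmk$value)       # compare the attained values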
2010 Feb 08
2
evolution of Nelder-Mead process
Dear list,
I am looking for an R-only implementation of a Nelder-Mead process that can find local maxima of a spatially distributed variable, e.g. height, on a spatial grid, and output the coordinates of the new point during each evaluation. I have found two previous threads about this topic, and was wondering if something similar has been implemented since those messages were posted.
Thank
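A minimal sketch of one way to get those coordinates with plain optim(): wrap the objective so that every evaluated point is recorded. The analytic height() surface below is only a stand-in for the poster's gridded variable (interpolating a real grid is omitted):
height <- function(xy) exp(-((xy[1] - 3)^2 + (xy[2] - 1)^2))   # single peak at (3, 1)
visited <- list()
tracked <- function(xy) {
  visited[[length(visited) + 1]] <<- xy   # record the coordinates of this evaluation
  -height(xy)                             # negate because optim() minimises
}
res  <- optim(c(0, 0), tracked, method = "Nelder-Mead")
path <- do.call(rbind, visited)           # one row per evaluated point
res$par                                   # close to c(3, 1)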
2010 May 15
3
multi-homed samba PDC and NetApp filers
We are having a problem getting a NetApp filer to re-join a samba
domain after a move to a new network. The filer worked fine with
samba before the move. Apologies in advance for the long missive.
I've tried the following:
- re-running the CIFS setup program on the filer
- removing the problem filer's samba account, replacing it, and
re-running the setup program on the filer
2002 Oct 28
1
Nelder-Mead and nlm
Hello,
I have been using R to fit my data using non-linear least squares. I
have used the optimize routine to minimize the sum of squared errors
(using the Nelder-Mead optimization routine), but couldn't get the
non-linear model in R to converge to the estimates achieved in a
convergent Nelder-Mead routine. It tells me about problems with the
gradient. I was wondering if there is any way
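For what it is worth, a common pattern here is to minimise the sum of squared errors with the derivative-free Nelder-Mead method and then hand the result to nls() as starting values; the exponential model and simulated data below are placeholders, not the poster's problem:
set.seed(1)
x <- seq(0, 10, length.out = 50)
y <- 5 * exp(-0.4 * x) + rnorm(50, sd = 0.1)
sse <- function(p) sum((y - p[1] * exp(-p[2] * x))^2)           # sum of squared errors
fit_nm  <- optim(c(1, 1), sse, method = "Nelder-Mead")          # derivative-free first pass
fit_nls <- nls(y ~ a * exp(-b * x),
               start = list(a = fit_nm$par[1], b = fit_nm$par[2]))
fit_nm$par
coef(fit_nls)                                                   # should agree closely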
2012 May 01
2
Define lower-upper bound for parameters in Optim using Nelder-Mead method
Dear UseRs,
Is there a way to define lower and upper bounds for the parameters fitted by
optim using the Nelder-Mead method?
Thanks,
Arnaud
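One frequently suggested workaround, sketched here with a placeholder objective and bounds, is to keep Nelder-Mead but optimise an unconstrained variable that is mapped into the box before the objective is evaluated (optim's own L-BFGS-B method accepts lower/upper directly):
lower <- c(0, 0); upper <- c(1, 10)
to_bounded <- function(z) lower + (upper - lower) * plogis(z)   # maps R^n into the box
obj <- function(p) sum((p - c(0.3, 4))^2)                       # placeholder objective
res <- optim(c(0, 0), function(z) obj(to_bounded(z)), method = "Nelder-Mead")
to_bounded(res$par)                                             # estimates on the original scale
# Alternatively: optim(start, obj, method = "L-BFGS-B", lower = lower, upper = upper)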
2009 Oct 10
2
Nelder-Mead with output of simplex vertices
Greetings!
I want to follow the evolution of a Nelder-Mead function
minimisation (a function of 2 variables). Hence each simplex
will have 3 vertices.
Therefore I would like to have a function which can output
the coordinates of the 3 vertices after each new simplex
is generated. However, there seems to be no way (which I can
detect) of extracting this information from optim() (the
2017 Dec 31
1
Order of methods for optimx
Dear R-er,
For a non-linear optimisation, I used optim() with the BFGS method, but it
regularly stopped before reaching a true minimum. It was not a problem
with the iteration limit, just a local minimum. I was sometimes able to
reach a better minimum using several rounds of optim().
Then I moved to optimx() to do the different optim rounds automatically
using "Nelder-Mead" and
2005 Mar 18
1
Constrained Nelder-Mead
All,
In looking at `optim', it doesn't appear that it is
possible to impose nonlinear constraints on Nelder-
Mead. I am sufficiently motivated to try to code
something in C from scratch and try to call it from
R....
Does anyone have some good references to barrier
and/or penalization methods for Nelder-Mead? I would
ideally like some papers with pseudocode for method(s)
that are in
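In the meantime, a plain-R quadratic-penalty loop is easy to sketch: add rho * (constraint violation)^2 to the objective and re-run Nelder-Mead with increasing rho. The objective and the constraint x1 + x2 >= 1 below are placeholders:
obj    <- function(x) sum(x^2)
constr <- function(x) 1 - x[1] - x[2]                 # feasible when <= 0
penalised <- function(x, rho) obj(x) + rho * max(0, constr(x))^2
x0 <- c(0, 0)
for (rho in c(1, 10, 100, 1000)) {
  res <- optim(x0, penalised, rho = rho, method = "Nelder-Mead")
  x0  <- res$par                                      # warm-start the stiffer subproblem
}
x0                                                    # approaches c(0.5, 0.5)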
2012 Aug 18
1
Parameter scaling problems with optim and Nelder-Mead method (bug?)
Dear all,
I'm having some problems getting optim with method="Nelder-Mead" to work
properly. It seems like there is no way of controlling the step size,
and the step size seems to depend on the *difference* between the
initial values, which makes no sense. Example:
f = function(xy, mu1, mu2) {
  print(xy)
  dnorm(xy[1] - mu1) * dnorm(xy[2] - mu2)
}
f1 = function(xy) -f(xy, 0,
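The poster's f and f1 are truncated above, so the badly scaled quadratic below is only a stand-in; it illustrates control = list(parscale = ...), the documented optim mechanism for declaring the typical magnitude of each parameter, which in turn rescales the steps the simplex takes:
badly_scaled <- function(x) (x[1] - 1)^2 + ((x[2] - 5000) / 1000)^2
# Default scaling:
optim(c(0.5, 2000), badly_scaled, method = "Nelder-Mead")$par
# Optimisation is performed on par/parscale, which changes the effective step sizes:
optim(c(0.5, 2000), badly_scaled, method = "Nelder-Mead",
      control = list(parscale = c(1, 1000)))$par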
2007 Jun 22
2
fitCopula
I am using R 2.5.0 on Windows XP and trying to fit a copula. I see that the
following code works for some users; however, my code crashes in chol. Any
suggestions?
> mycop <- tCopula(param=0.5, dim=8, dispstr="ex", df=5)
> x <- rcopula(mycop, 1000)
> myfit <- fitCopula(x, mycop, c(0.6, 10), optim.control=list(trace=1),
method="Nelder-Mead")
2005 Nov 15
1
An optim() mystery.
I have a Master's student working on a project which involves
estimating parameters of a certain model via maximum likelihood,
with the maximization being done via optim().
A phenomenon has occurred which I am at a loss to explain.
If we use certain pairs of starting values for optim(), it
simply returns those values as the ``optimal'' values, although
they are definitely not
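A hedged diagnostic sketch (the objective below is a placeholder for the real negative log-likelihood): when optim() hands back the starting values, it is worth checking the evaluation counts and convergence code it returns, and whether the objective changes at all near the start, since a flat or underflowed surface makes Nelder-Mead stop on its initial simplex:
negll <- function(p) sum((p - c(2, 3))^2)        # stand-in for the real -log-likelihood
start <- c(0, 0)
res <- optim(start, negll, method = "Nelder-Mead")
res$convergence                                  # 0 means optim believes it converged
res$counts                                       # a tiny evaluation count is a red flag
sapply(1:5, function(i) negll(start + 0.1 * rnorm(2)))   # does the surface move at all?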
2010 Nov 21
1
solve nonlinear equation using BBsolve
Hi r-users,
I would like to solve a system of nonlinear equations using the BBsolve function;
my code is below. I have 4 parameters and 4 equations.
mgf_gammasum <- function(p)
{
  t <- rep(NA, length(p))
  mn  <- 142.36
  vr  <- 9335.69
  sk  <- 0.8139635
  kur <- 3.252591
  rh  <- 0.896
  # cumulants
  k1 <- p[1]*(p[2]+p[3])
  k2 <- p[1]*(2*p[2]*p[3]*p[4] + p[2]^2 + p[3]^2)
  k3 <-
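For reference, the calling pattern BBsolve() expects is a function that takes the parameter vector and returns one residual per equation; the two-equation system below is a placeholder, not the poster's moment-matching system:
library(BB)
toy_system <- function(p) {
  r <- rep(NA, length(p))
  r[1] <- p[1]^2 + p[2]^2 - 2     # = 0
  r[2] <- p[1] - p[2]             # = 0
  r
}
sol <- BBsolve(par = c(0.5, 0.5), fn = toy_system)
sol$par                           # approximately c(1, 1)
sol$convergence                   # 0 indicates success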
2007 Jan 03
1
optim
Hi!
I'm trying to figure out how to use optim... I get some really strange results, so I guess I got something wrong.
I defined the following function which should be minimized:
errorFunction <- function(localShifts, globalShift, fileName, experimentalPI, lambda)
{
  lambda <- 1/sqrt(147)
  # error <- abs(errHuber(localShifts,globalShift,
  #
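One point that often trips people up with a multi-argument objective like this: optim() optimises only the first argument, and everything else has to be passed by name through optim()'s "...". The simplified errorFunction below is a placeholder for the poster's:
errorFunction <- function(localShifts, globalShift, lambda)
  sum((localShifts - globalShift)^2) + lambda * sum(abs(localShifts))
res <- optim(par         = rep(0, 5),       # starting values for localShifts
             fn          = errorFunction,
             globalShift = 2,                # fixed arguments are passed through ...
             lambda      = 1 / sqrt(147),
             method      = "Nelder-Mead")
res$par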
2008 Mar 11
1
messages from mle function
Dear useRs,
I am using the mle function, but it gives me the following errors, which I
don't understand. Perhaps someone can help me.
Thank you for your attention.
Bernardo.
> erizo <- read.csv("Datos_Stokes_1.csv", header = TRUE)
> head(erizo)
  EDAD TALLA
1    0   7.7
2    1  14.5
3    1  16.9
4    1  13.2
5    1  24.4
6    1  22.5
> TAN <-
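Without the actual error messages it is hard to say more, but a minimal working mle() call, with simulated data standing in for the EDAD/TALLA columns and a linear-growth-plus-normal-error model as a placeholder, looks like this; note that mle() wants the negative log-likelihood with the parameters as separate named arguments, and that working on log(sigma) keeps the standard deviation positive:
library(stats4)
set.seed(42)
EDAD  <- 0:10
TALLA <- 8 + 5 * EDAD + rnorm(length(EDAD), sd = 2)
negll <- function(a, b, logsigma)
  -sum(dnorm(TALLA, mean = a + b * EDAD, sd = exp(logsigma), log = TRUE))
fit <- mle(negll, start = list(a = 10, b = 4, logsigma = 1))
summary(fit)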
2007 Nov 04
3
Returning the mock associated with an expectation.
I was reading through the FlexMock docs and noticed the expectation
method .mock, which returns the original mock associated with an
expectation.
It looks really handy for writing nice all-in-one mocks like:
mock_user = mock('User').expects(:first_name).returns('Jonah').mock
So I started playing around with mocha and found I could actually
already do this!
2020 Oct 28
2
R optim() function
Hi R-Help,
I am using R to do functional outlier detection (using PCA to reduce to 2 dimensions - the functional boxplot methodology used in the Rainbow package), and using the Hscv.diag function to calculate the bandwidth matrix, where this line of code is run:
result <- optim(diag(Hstart), scv.mat.temp, method = "Nelder-Mead", control = list(trace = as.numeric(verbose)))
Within the
2009 Dec 10
1
obtain intermediate estimate using optim
Hi,
Currently I am trying to solve a minimization problem using optim with method Nelder-Mead. However, Nelder-Mead needs many iterations until it finally converges. I have set $control.trace and $control.report such that I can see the value of the function at each iteration. I do see that I have set the convergence criteria too strict, in the sense that the function value does not change much. However,
2011 Apr 18
3
how to extract options for a function call
Hi, I'm having some difficulties formulating this question.
But what I want is to extract the options associated with a parameter of a function.
e.g.
method = c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN")
in the optim function.
So I would like to have a vector with
c("Nelder-Mead", "BFGS", "CG",
2023 Aug 13
4
Noisy objective functions
While working on 'random walk' applications, I got interested in
optimizing noisy objective functions. As an (artificial) example, the
following is the Rosenbrock function, where Gaussian noise of standard
deviation `sd = 0.01` is added to the function value.
fn <- function(x)
(1+rnorm(1, sd=0.01)) * adagio::fnRosenbrock(x)
To smooth out the noise, define another
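The excerpt breaks off here; a natural continuation (a sketch only, assuming the adagio package is installed so that fn above runs) is to average several replicate evaluations before handing the objective to the optimiser:
fn_smoothed <- function(x) mean(replicate(20, fn(x)))   # average out the noise (20 is arbitrary)
optim(c(-1.2, 1), fn_smoothed, method = "Nelder-Mead")$par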
2012 Oct 23
1
help using optim function
Hi, I am very new to R and I've written a function that calls optim, but can't get
it to work
least.squares.fitter <- function(start.params, gr, low.constraints, high.constraints,
                                 model.one.stepper, data, scale, ploton = F)
{
  result <- optim(par = start.params, method = c('Nelder-Mead'), fn = least.squares.fit,
                  lower = low.constraints, upper = high.constraints,
                  data = data, scale = scale, ploton = ploton)