Displaying 20 results from an estimated 10000 matches similar to: "Query: Could documentation include modernized references?"
2023 Mar 26
2
Query: Could documentation include modernized references?
On 26/03/2023 11:54 a.m., J C Nash wrote:
> A tangential email discussion with Simon U. has highlighted a long-standing
> matter that some tools in the base R distribution are outdated, but that
> so many examples and other tools may use them that they cannot be deprecated.
>
> The examples that I am most familiar with concern optimization and nonlinear
> least squares, but
2023 Mar 31
1
Query: Could documentation include modernized references?
>>>>> Duncan Murdoch
>>>>> on Sun, 26 Mar 2023 12:41:03 -0400 writes:
> On 26/03/2023 11:54 a.m., J C Nash wrote:
>> A tangential email discussion with Simon U. has
>> highlighted a long-standing matter that some tools in the
>> base R distribution are outdated, but that so many
>> examples and other tools may use
2011 Apr 18
3
how to extract options for a function call
Hi, I'm having some difficulties formulating this question.
But what I want is to extract the options associated with a parameter of a
function.
e.g.
method = c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN")
in the optim function.
So I would like to have a vector with
c("Nelder-Mead", "BFGS", "CG",
2011 Jul 11
3
fitdistr() Error
I am trying to fit a gamma distribution to real data and I am getting the
following error messages.
When I set a lower limit, the error message is "L-BFGS-B needs finite
values of fn".
For other methods the error message is:
Error in optim(x = c(0.105286666666667, 0.3472275, 2.057625, 0.329675, ... :
  non-finite finite-difference value [1]
The code works fine for simulated data
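A common cause of both messages is the optimizer stepping to a non-positive
shape or rate, where the gamma log-density is non-finite. One frequently
suggested workaround, sketched here with placeholder data, is to bound the
parameters away from zero:

library(MASS)
x <- rgamma(200, shape = 2, rate = 3)   # placeholder for the real data
# 'lower' is passed through to optim's L-BFGS-B, keeping shape and rate
# strictly positive so the log-density stays finite
fit <- fitdistr(x, "gamma", lower = 0.001)
fit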
2019 Mar 04
2
Package inclusion in R core implementation
As the original coder (in the mid-1970s) of BFGS, CG and Nelder-Mead in optim(), I've
been pushing for some time for their deprecation. They aren't "bad", but we have
better tools, and they are in CRAN packages. Similarly, I believe other optimization
tools in the core (optim::L-BFGS-B, nlm, nlminb) can and should be moved to
packages (there are already 2 versions at least of LBFGS
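As an illustration of the CRAN-package route alluded to above, the optimx
package wraps the legacy optim() methods together with newer codes behind
one interface, so the same problem can be run through several methods:

library(optimx)
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2   # Rosenbrock test
# returns one row of results per method, for side-by-side comparison
optimx(c(-1.2, 1), fr, method = c("Nelder-Mead", "BFGS", "L-BFGS-B"))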
2017 Dec 31
1
Order of methods for optimx
Dear R-er,
For a non-linear optimisation, I used optim() with the BFGS method but it
stopped regularly before reaching a true minimum. It was not a problem
with the limit of iterations, just a local minimum. I was sometimes able to
reach a better minimum using several rounds of optim().
Then I moved to optimx() to do the different optim rounds automatically
using "Nelder-Mead" and
2012 Jun 09
0
R-devel Digest, Vol 112, Issue 8
I'll not be able to comment on the use of C like this, but will warn that I wrote the
routines that became Nelder-Mead, CG, and BFGS in optim() in the mid 1970s. CG never did
as well as I would like, but the other two routines turned out pretty well. However, in
nearly 40 years, there are a few improvements, particularly in handling bounds and masks
(fixed parameters). For all-R routines see
2007 Jul 29
1
behavior of L-BFGS-B with trivial function triggers bug in stats4::mle
With the exception of "L-BFGS-B", all of the
other optim() methods return the value of the function
when they are given a trivial function (i.e., one with no
variable arguments) to optimize. I don't think this
is a "bug" in L-BFGS-B (more like a response to
an undefined condition), but it leads to a bug in stats4::mle --
a spurious error saying that a better fit
has been
2023 Aug 13
4
Noisy objective functions
While working on 'random walk' applications, I got interested in
optimizing noisy objective functions. As an (artificial) example, the
following is the Rosenbrock function, with multiplicative Gaussian noise of
standard deviation `sd = 0.01` applied to the function value.
fn <- function(x)
(1+rnorm(1, sd=0.01)) * adagio::fnRosenbrock(x)
To smooth out the noise, define another
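The post is truncated here; a plausible continuation (my assumption, not
the author's actual code) is a wrapper that averages repeated evaluations,
shrinking the noise standard deviation by a factor of 1/sqrt(k):

fn_smooth <- function(x, k = 25) mean(replicate(k, fn(x)))
optim(c(-1.2, 1), fn_smooth, method = "Nelder-Mead")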
2009 Feb 12
1
Setting optimizer in lme
I am using R 2.7.0 on a Linux platform.
I am trying to reproduce a 2002 example using lme from the nlme library.
I want to change the optimizer from the default (nlminb) to optim.
Specifically, this is what I am trying to do:
R> library(nlme)
R> library(car) # for data only
R> data(Blackmoor) # from car
R> Blackmoor$log.exercise <- log(Blackmoor$exercise + 5/60, 2)
R>
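The snippet is cut off before the model call; in nlme the switch is made
through lmeControl(), roughly as in this sketch (the model formula is a
placeholder):

fit <- lme(log.exercise ~ age, data = Blackmoor,
           random = ~ age | subject,
           control = lmeControl(opt = "optim",      # default is "nlminb"
                                optimMethod = "BFGS"))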
2012 Nov 28
1
How to change smoothing constant selection procedure for Winters Exponential Smoothing models?
Hello all,
I am looking for some help in understanding how to change the way R
optimizes the smoothing constant selection process for the HoltWinters
function.
I'm a SAS veteran but very new to R and still learning my way around.
Here is some sample data and the current HoltWinters code I'm using:
rawdata <- c(294, 316, 427, 487, 441, 395, 473, 423, 389, 422, 458, 411,
433, 454,
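Once rawdata is complete, HoltWinters() offers two hooks for this: passing
a numeric value fixes that smoothing constant outright, while optim.start
and optim.control steer the internal L-BFGS-B search. A hedged sketch:

ts.data <- ts(rawdata, frequency = 12)   # assumes monthly data
# a numeric alpha/beta/gamma is taken as fixed, with no optimization
hw.fixed <- HoltWinters(ts.data, alpha = 0.2, beta = 0.1, gamma = 0.1)
# alternatively, change where the internal optimizer starts its search
hw.tuned <- HoltWinters(ts.data,
                        optim.start = c(alpha = 0.5, beta = 0.1, gamma = 0.1))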
2005 Nov 15
1
An optim() mystery.
I have a Master's student working on a project which involves
estimating parameters of a certain model via maximum likelihood,
with the maximization being done via optim().
A phenomenon has occurred which I am at a loss to explain.
If we use certain pairs of starting values for optim(), it
simply returns those values as the ``optimal'' values, although
they are definitely not
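A hypothetical diagnostic sketch for this situation (fn and start stand in
for the real objective and starting values): check that the objective is
finite and actually varies near the start, then rescale if necessary:

fn(start)            # finite? not NA/Inf?
fn(start * 1.001)    # does a small move change the value at all?
optim(start, fn, method = "Nelder-Mead",
      control = list(trace = 1,                      # watch progress
                     parscale = abs(start) + 1e-8))  # fix bad scaling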
2007 Apr 09
1
R:Maximum likelihood estimation using BHHH and BFGS
Dear R users,
I am new to R. I would like to find *maximum likelihood estimators for psi
and alpha* based on the following *log likelihood function*; c is
consumption data comprising 148 entries:
fn <- function(c, psi, alpha)
{
  n <- length(c)
  # vectorized: the original sum(for(...)) idiom always gives 0, because
  # for() returns NULL; lag(c[i], -1) is read here as the lagged value c[i-1]
  s1 <- sum((c[2:n] - psi^(-1/alpha) * c[1:(n-1)])^2 *
              c[1:(n-1)]^(-2 * (alpha + 1)))
  s2 <- sum(log(c[1:(n-1)]^(2 * alpha + 2)))
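Base optim() offers BFGS but has no BHHH method; the CRAN package maxLik
does. BHHH needs observation-level log-likelihood contributions, so here is
a sketch of the interface (the contributions below are placeholders built
from the s1/s2 terms, not a verified likelihood; the consumption vector c
is assumed to be in the workspace):

library(maxLik)
loglik <- function(theta) {
  psi <- theta[1]; alpha <- theta[2]
  n  <- length(c)
  cl <- c[1:(n - 1)]
  # one contribution per observation, as BHHH requires
  -((c[2:n] - psi^(-1/alpha) * cl)^2 * cl^(-2 * (alpha + 1))) -
    log(cl^(2 * alpha + 2))
}
fit <- maxLik(loglik, start = c(psi = 1, alpha = 0.5), method = "BHHH")
summary(fit)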
2007 Jan 03
1
optim
Hi!
I'm trying to figure out how to use optim... I get some really strange results, so I guess I got something wrong.
I defined the following function which should be minimized:
errorFunction <- function(localShifts, globalShift, fileName, experimentalPI, lambda)
{
  lambda <- 1/sqrt(147)   # note: this overwrites the 'lambda' argument above
  # error <- abs(errHuber(localShifts, globalShift,
  #
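One frequent source of "strange results" with calls like this is how the
fixed arguments reach the objective: optim() varies only the first argument
of fn, and everything else must be supplied as named arguments through
optim's '...'. A sketch with placeholder values from the post:

res <- optim(par = initialLocalShifts,   # the vector being optimized
             fn  = errorFunction,
             globalShift    = 0,
             fileName       = "data.txt",
             experimentalPI = expPI,
             lambda         = 1 / sqrt(147))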
2010 Feb 08
2
evolution of Nelder-Mead process
Dear list,
I am looking for an R-only implementation of a Nelder-Mead process that can find local maxima of a spatially distributed variable (e.g. height) on a spatial grid, and that outputs the coordinates of the new point during each evaluation. I have found two previous threads about this topic, and was wondering if something similar has been implemented since those messages were posted.
Thank
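A hedged sketch of one way to do this with base optim(): interpolate the
gridded variable to a continuous surface, negate it (optim minimizes), and
log each coordinate that gets evaluated. 'height.grid' is a placeholder
list(x, y, z), and (x0, y0) a placeholder start:

library(fields)
visited <- list()
neg_height <- function(p) {
  visited[[length(visited) + 1L]] <<- p      # record coordinates tried
  -interp.surface(height.grid, rbind(p))     # bilinear interpolation
}
optim(c(x0, y0), neg_height, method = "Nelder-Mead")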
2010 Mar 05
2
Improved Nelder-Mead algorithm - a potential replacement for optim's Nelder-Mead
Hi,
I have written an R translation of C.T. Kelley's Matlab version of the Nelder-Mead algorithm. This algorithm is discussed in detail in his book "Iterative methods for optimization" (SIAM 1999, Chapter 8). I have tested this relatively extensively on a number of smooth and non-smooth problems. It performs well, in general, and it almost always outperforms optim's
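If I recall correctly, this translation later appeared on CRAN as nmk() in
the dfoptim package; its interface mirrors optim(par, fn):

library(dfoptim)
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
nmk(c(-1.2, 1), fr)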
2008 Mar 11
1
messages from mle function
Dear useRs,
I am using the mle function, but it gives me the following errors that I
don't understand. Perhaps there is someone who can help me.
Thank you for your attention.
Bernardo.
> erizo <- read.csv("Datos_Stokes_1.csv", header = TRUE)
> head(erizo)
EDAD TALLA
1 0 7.7
2 1 14.5
3 1 16.9
4 1 13.2
5 1 24.4
6 1 22.5
> TAN <-
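The error message itself is cut off, but a minimal stats4::mle call on
size-at-age data like this would look roughly as follows (the von
Bertalanffy growth form and starting values are my assumptions, not the
poster's):

library(stats4)
nll <- function(Linf, K, sigma) {
  mu <- Linf * (1 - exp(-K * erizo$EDAD))   # mean length at age
  -sum(dnorm(erizo$TALLA, mean = mu, sd = sigma, log = TRUE))
}
fit <- mle(nll, start = list(Linf = 100, K = 0.2, sigma = 5))
summary(fit)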
2010 Dec 03
2
Competing with one's own work
No, this is not about Rcpp, but a comment in that overly long discussion raised a question
that has been in my mind for a while.
It is that one may have work that is used in R's base functionality, while
improvements exist that should be incorporated.
For me, this concerns the BFGS, Nelder-Mead and CG options of optim(), which are based on
the 1990 edition (Pascal codes) of my 1979 book
2009 Dec 10
1
obtain intermediate estimate using optim
Hi,
Currently I am trying to solve a minimization problem using optim with method Nelder-Mead. However, Nelder-Mead needs many iterations until it finally converges. I have set the trace and REPORT entries of control such that I can see the value of the function at each iteration. I do see that I set the convergence criteria too strict, in the sense that the function value does not change much. However,
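A workaround sketch: wrap the objective so that every evaluation is logged,
which yields the intermediate estimates regardless of optim's own reporting
(fn and start are placeholders):

evals <- list()
fn_logged <- function(par) {
  val <- fn(par)
  evals[[length(evals) + 1L]] <<- list(par = par, value = val)
  val
}
res <- optim(start, fn_logged, method = "Nelder-Mead",
             control = list(maxit = 5000))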
2006 Jun 12
1
r's optim vs. matlab's fminsearch
Hi,
I'm having a problem converting a Matlab program into R. The R code works
almost all the time, but about 4% of the time R's optim function gets stuck
on a local minimum whereas matlab's fminsearch function does not (or at
least fminsearch finds a better minimum than optim). My understanding is
that both functions default to Nelder-Mead optimization, but what's
different about
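One concrete difference worth checking is the stopping rules: the two
implementations build their initial simplex and terminate differently, so
before blaming the algorithm it is worth raising optim's iteration cap,
tightening its relative tolerance, and restarting from the returned point,
as fminsearch users often do (a sketch; fn and start are placeholders):

ctrl <- list(maxit = 5000, reltol = 1e-12)
res  <- optim(start, fn, method = "Nelder-Mead", control = ctrl)
# a restart rebuilds the simplex around the current best point and can
# dislodge a stalled search
res2 <- optim(res$par, fn, method = "Nelder-Mead", control = ctrl)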