Displaying 7 results from an estimated 7 matches for "stepmax".
2000 Mar 06
1
nlm and optional arguments
...ion that passes the optional
arguments to the objective function), but I presume this would be cleaner
and faster if implemented at a lower level ...
nlm2 <- function(f, p, hessian=FALSE, typsize=rep(1,length(p)),
                 fscale=1, print.level=0, ndigit=12, gradtol=1e-6,
                 stepmax=max(1000 * sqrt(sum((p/typsize)^2)), 1000),
                 steptol=1e-6, iterlim=100, check.analyticals=TRUE, ...) {
    tmpf <- function(x) {
        f(x, ...)
    }
    nlm(tmpf, p, hessian=hessian, typsize=typsize, fscale=fscale,
        print.level=print.level, ndigit=ndigit, gradtol=gradtol,
        stepmax=stepmax, s...
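The wrapper above automates a closure idiom (later versions of nlm() pass extra arguments via `...` directly); a minimal sketch of that idiom, with a toy least-squares objective of my own rather than anything from the post:

```r
# Extra data is captured in the enclosing environment, so nlm() only
# ever sees a function of the parameter vector.
make_objective <- function(x, y) {
  function(beta) sum((y - beta[1] - beta[2] * x)^2)  # least-squares loss
}
set.seed(1)
x <- 1:10
y <- 2 + 3 * x + rnorm(10, sd = 0.1)
fit <- nlm(make_objective(x, y), p = c(0, 0))
fit$estimate  # approximately c(2, 3)
```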
2010 Oct 13
1
Weird nlm behaviour in 2.10.1 and 2.12.0 [Sec=Unclassified]
...ll to nlm().
This can be replicated by:
FixedRemovals<-1836180125888
AbStageInitial<-2223033830403
Rates<- 0.3102445
nlm(function(rootM, Abund, Loss, OtherM) {
        (Loss - (rootM / (rootM + OtherM) *
                 (1 - exp(-(rootM + OtherM))) *
                 Abund))^2
    },
    0.001, print.level = 0, fscale = 0, gradtol = 1e-10, stepmax = 100.0,
    Loss = FixedRemovals,
    Abund = AbStageInitial,
    OtherM = Rates)$estimate
Australian Antarctic Division - Commonwealth of Australia
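With inputs around 1e12, the squared residual above is around 1e24, which is a plausible source of the odd behaviour; a hedged sketch (my modification, not the poster's code) that rescales the objective to order 1:

```r
FixedRemovals <- 1836180125888
AbStageInitial <- 2223033830403
Rates <- 0.3102445
# Dividing the residual by Loss keeps the objective O(1), so nlm's
# default tolerances and the stepmax setting remain meaningful.
f <- function(rootM, Abund, Loss, OtherM) {
  ((Loss - rootM / (rootM + OtherM) *
      (1 - exp(-(rootM + OtherM))) * Abund) / Loss)^2
}
nlm(f, 0.001, gradtol = 1e-10, stepmax = 100,
    Loss = FixedRemovals, Abund = AbStageInitial, OtherM = Rates)$estimate
```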
2010 Mar 25
1
*** caught segfault *** address 0x18, cause 'memory not mapped'
...oc.time()
> sink("out22031001.txt")
> fmainsimbiasP(10000,100)
proc.time()-ptm
*** caught segfault ***
address 0x18, cause 'memory not mapped'
Traceback:
1: nlm(f = fprof_deriv, x = x, p = parHInt, b = parFIX, hessian = T,
iterlim = 2000, check.analyticals = F, stepmax = 10)
2: doTryCatch(return(expr), name, parentenv, handler)
3: tryCatchOne(expr, names, parentenv, handlers[[1]])
4: tryCatchList(expr, classes, parentenv, handlers)
5: tryCatch(expr, error = function(e) { call <- conditionCall(e)
if (!is.null(call)) { if (identical(call[[1]],
quo...
2008 Jan 15
1
Viewing source code for .Internal functions
I am trying to view the source code of the function nlm in the stats
package of R 2.4.1.
I downloaded the source from CRAN and opened nlm.R, and it calls a
.Internal function:
.Internal(nlm(function(x) f(x, ...), p, hessian, typsize, fscale,
msg, ndigit, gradtol, stepmax, steptol, iterlim))
This is the same thing I saw when entering the function name at the R
command prompt. Where will I find the actual code?
Thanks.
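A hedged pointer, based on my reading of the R source tree (paths may differ across versions): the .Internal entry for nlm is written in C and can be located with grep in an unpacked R source tarball:

```shell
# In the top-level directory of the unpacked R sources:
grep -rn "do_nlm" src/main/   # the .Internal entry point (src/main/optimize.c)
grep -rn "optif9" src/appl/   # the UNCMIN optimizer it drives (src/appl/uncmin.c)
```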
2005 Oct 11
2
Sometimes having problems finding a minimum using optim(), optimize(), and nlm() (while searching for noncentral F parameters)
...erent R function.
optimize(f=Low.Lim.NC.F, lower=LL.0, upper=50, maximum=FALSE, tol=tol,
alpha.lower=alpha.lower, F.value=F.value, df.1=df.1, df.2=df.2)
# Try to accomplish the same task with a different R function.
nlm(f=Low.Lim.NC.F, p=LL.0, fscale=1,
print.level = 0, ndigit=12, gradtol = 1e-6,
stepmax = max(1000 * sqrt(sum((LL.0/10)^2)), 1000),
steptol = 1e-6, iterlim = 1000, check.analyticals = TRUE,
alpha.lower=alpha.lower, F.value=F.value, df.1=df.1, df.2=df.2)
# The answer in each case is 3.0725. Thus, a noncentral F with
# 5 and 200 df with a noncentrality parameter 3.0725 has at its .975...
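The excerpt is cut off, but the task appears to be inverting the noncentral F distribution; a hedged reconstruction (F.value here is a hypothetical input of my own, and this is not the poster's Low.Lim.NC.F) of the kind of objective both calls minimise:

```r
# Hypothetical inputs for illustration only.
F.value <- 3; df.1 <- 5; df.2 <- 200; alpha.lower <- 0.025
# Squared distance between the noncentral-F CDF at F.value and the
# target tail probability; its root is the lower limit on ncp.
obj <- function(ncp) (pf(F.value, df.1, df.2, ncp = ncp) - (1 - alpha.lower))^2
optimize(obj, lower = 0, upper = 50)$minimum
```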
1999 Feb 08
0
Constrained minimisation
Hi
Apart from nlm() with the stepmax= argument, is there any other >1-dimensional
minimisation function where I can limit the step size dynamically? With the
NAG routine E04NBF (I think) I used to bound the size of the next step using
a*arctan(x/a) tricks where a function was particularly difficult near a
boundary and thus 's...
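The a*arctan(x/a) trick can still be wrapped around nlm(); a minimal sketch with a toy objective of my own (the choice a = 10 is arbitrary):

```r
# Optimize over an unconstrained u; a*atan(u/a) ~ u for small u but is
# bounded by a*pi/2, so large proposed steps are smoothly damped.
f <- function(x) (x[1] - 1)^2 + (x[2] + 2)^2   # toy objective
a <- 10
g <- function(u) f(a * atan(u / a))            # atan() is vectorized
fit <- nlm(g, p = c(0, 0))
a * atan(fit$estimate / a)                     # back-transformed optimum, ~ c(1, -2)
```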
2000 Dec 07
0
Tuning the nlm function
Hi Everyone,
Is there a simple way to force nlm to take larger initial steps? Setting
print.level = 2 allows me to inspect the step size at each iteration, but
I appear not to have made any appreciable impact on it by changing values
of typsize, fscale, steptol or stepmax. The steps repeatedly come out
tiny, typically around 1e-9, and the algorithm is terminating not because
the gradient is zero (it is not, according to the numerical values) but
because the function value is not changing between iterations. Am I
missing something obvious?
Many thanks, Jonathan.
Jonatha...