Displaying 20 results from an estimated 6000 matches similar to: "numericDeriv"
2005 Nov 16
2
numericDeriv
I have to compute some standard errors using the delta
method and so have to use the command "numericDeriv"
to get the desired gradient. Before using it on my
complicated function, I've tried it on a simple
example:
x <- 1:6
numericDeriv(quote(x^3), "x")
and I get:
[1]   1   8  27  64 125 216
attr(,"gradient")
     [,1] [,2] [,3] [,4] [,5] [,6]
[1,]  Inf
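A minimal sketch of the same check, under the assumption that the Inf gradient comes from the storage mode of x: numericDeriv perturbs the named variable in place, so it should be a double vector, and an integer 1:6 is a plausible cause of the output above.

x <- as.numeric(1:6)               # double, not integer
d <- numericDeriv(quote(x^3), "x")
d                                  # 1 8 27 64 125 216
attr(d, "gradient")                # 6 x 6 matrix with 3*x^2 on the diagonal,
                                   # ready to plug into the delta-method variance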
2005 May 05
2
Numerical Derivative / Numerical Differentiation of unknown function
Hi,
I have been trying to do numerical differentiation using R.
I found some old S code using Richardson Extrapolation which I managed to get
to work.
I am posting it here in case anyone needs it.
########################################################################
richardson.grad <- function(func, x, d=0.01, eps=1e-4, r=6, show=F){
# This function calculates a numerical approximation
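For comparison, a compact sketch of the same idea (central differences plus one Richardson extrapolation step); the numDeriv package's grad() now provides a polished implementation of this kind of scheme.

richardson_step <- function(func, x, h = 1e-2) {
  # central difference at step h and h/2, then extrapolate away the O(h^2) term
  cd <- function(h) (func(x + h) - func(x - h)) / (2 * h)
  (4 * cd(h / 2) - cd(h)) / 3
}
richardson_step(sin, 1)   # close to cos(1)
cos(1)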
2005 Sep 25
2
getting variable length numerical gradient
Hi all.
I have a numerical function f(x), with x being a vector of generic
size (say k=4), and I want to take the numerically computed gradient,
using deriv or numericDeriv (or something else).
My difficulty here is that in deriv and numericDeriv the function
is passed as an expression, and one has to pass the list of variables
involved as a character vector... So, it's a pure R programming
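One way to sidestep the expression/character-vector awkwardness entirely is a small finite-difference helper that works for any length k; f here is a stand-in for the real function.

num_grad <- function(f, x, h = 1e-6) {
  # forward differences, one coordinate at a time; x can have any length k
  sapply(seq_along(x), function(i) {
    xh <- x
    xh[i] <- xh[i] + h
    (f(xh) - f(x)) / h
  })
}
f <- function(x) sum(x^2) + prod(x)   # stand-in objective
num_grad(f, c(1, 2, 3, 4))            # compare with 2*x + prod(x)/x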
2006 Mar 12
2
Numerical Derivatives in R
Hi,
Suppose I have an arbitrary function:
arbfun<-function(x) {...}
Is there a robust implementation of a numerical derivative routine in R
which I can use to take its derivative? Something a bit more than
simple division by delta of the difference of evaluating the function at
x and x+delta...
Perhaps there is a way to do this using D or deriv but I could not
figure it out.
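One option, sketched here with a made-up arbfun: the numDeriv package on CRAN wraps Richardson-extrapolated finite differences behind a one-line interface.

library(numDeriv)                  # install.packages("numDeriv") if needed
arbfun <- function(x) x * exp(-x^2)
grad(arbfun, 1)                    # numerical derivative at x = 1
hessian(function(x) sum(sin(x)), c(1, 2))   # also handles vector arguments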
2003 Apr 25
2
AW: numericDeriv and ecdf
> On only ten points, what did you expect ? Even with 1000
> observations, estimating a density is difficult, and has
> been the subject of a century of research. Kernel density
> estimates are among the most successful. For your immediate
> application, try plot(density(rnorm(10)), type="l"), etc.
wait, you misunderstood me!
I'd like to see 10 or 9 points with
2012 May 18
1
Help for numericDeriv function
Hi,
I have been stuck on something for a couple of days and am almost about to give up.
This looks simple, but I can't figure it out. I hope I can get some help here.
I am trying to do some symbolic and numerical derivatives. Let me explain
the problem. Let's say I have a matrix as follows:
> load <- matrix(c(3,0,1,4,1,3),nrow=3,ncol=2,byrow=TRUE)
>
> load
     [,1] [,2]
[1,]    3    0
[2,]    1    4
[3,]    1    3
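The snippet cuts off before the actual target expression, so purely as a hedged illustration of the mechanics (phi and the matrix expression below are hypothetical): wrap the matrix-valued expression so it evaluates to a numeric vector, and numericDeriv returns one gradient row per element.

load <- matrix(c(3, 0, 1, 4, 1, 3), nrow = 3, ncol = 2, byrow = TRUE)
phi <- 0.5                                      # hypothetical scalar parameter
# hypothetical matrix expression; c() flattens it to a length-9 vector
numericDeriv(quote(c(load %*% t(load) * phi)), "phi")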
2007 Feb 13
1
nls: "missing value or an infinity" (Error in numericDeriv) and "singular gradient matrix"Error in nlsModel
Hi,
I am a non-expert user of R. I am attempting to fit two different functions to my data, but I receive two different error messages. I suppose I have two different problems here... but of what nature? In the first instance I tried some different starting values for the parameters, but without success.
If anyone could suggest a sensible way to proceed to solve these I would be
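Without the actual formulas there is only generic advice; here is a sketch with simulated data (stand-in model and names) of the usual first check, namely evaluating the right-hand side at the starting values and watching the trace.

set.seed(1)
mydata <- data.frame(x = 1:20)
mydata$y <- 5 * exp(-0.3 * mydata$x) + rnorm(20, sd = 0.1)
start <- list(a = 1, b = 0.1)
rhs <- with(c(mydata, start), a * exp(-b * x))   # model at the start values
any(!is.finite(rhs))   # TRUE here would explain "missing value or an infinity"
fit <- nls(y ~ a * exp(-b * x), data = mydata, start = start, trace = TRUE)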
2020 Jun 15
2
numericDeriv alters result of eval in R 4.0.1
Dear R developers,
I've run into a weird behavior of the numericDeriv function (from the stats
package) which I also posted on StackOverflow (question has same title as
this email, except for the version of R).
Running the code below, we can see that the numericDeriv function gives an
error as the derivative of x^a wrt a is x^a * log(x) and log is not defined
for negative numbers. However,
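The analytic point being made can be checked directly with D():

D(quote(x^a), "a")   # x^a * log(x)
log(-2)              # NaN (with a warning), hence the trouble for negative x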
2020 Jun 16
1
[External] numericDeriv alters result of eval in R 4.0.1
Dear all
As far as I could trace, looking at the C function numeric_deriv,
this unwanted behavior comes from the innermost loop at the very end
of the function:
for(i = 0, start = 0; i < LENGTH(theta); i++) {
    for(j = 0; j < LENGTH(VECTOR_ELT(pars, i)); j++, start += LENGTH(ans)) {
        SEXP ans_del;
        double origPar, xx, delta;
        origPar = REAL(VECTOR_ELT(pars, i))[j];
2003 Mar 26
1
nls
Hi,
df <- read.table("data.txt", header=T);
library(nls);
fm <- nls(y ~ a*(x+d)^(-b), df, start=list(a=max(df$y,na.rm=T)/2,b=1,d=0));
I was using the routine above, which was giving either a singular gradient error or:
Error in numericDeriv(form[[3]], names(ind), env) :
Missing value or an Infinity produced when evaluating the model
I also tried the
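A hedged sketch with toy data standing in for data.txt: with the starting values b = 1 and d = 0, the term (x + d)^(-b) is Inf wherever x == 0 and NaN wherever x + d < 0, which is one common source of exactly this error; a starting d that keeps x + d positive avoids it.

set.seed(42)
df <- data.frame(x = 0:9)
df$y <- 10 * (df$x + 1)^(-0.8) + rnorm(10, sd = 0.05)
any(df$x + 0 <= 0)    # TRUE: with start d = 0 the term blows up at x = 0
fm <- nls(y ~ a * (x + d)^(-b), df,
          start = list(a = max(df$y, na.rm = TRUE)/2, b = 1, d = 1))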
2011 Sep 02
5
Hessian Matrix Issue
Dear All,
I am running a simulation to obtain coverage probability of Wald type
confidence intervals for my parameter d in a function of two parameters
(mu,d).
I am optimizing it using "optim" with method "L-BFGS-B" to obtain the MLE, and I
want to invert the Hessian matrix to get standard errors of the two
parameter estimates. However, my Hessian matrix at times becomes
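A hedged sketch with a stand-in likelihood (the (mu, d) model itself is not shown): ask optim for the Hessian and only invert it after checking that it is numerically well conditioned.

set.seed(1)
z <- rnorm(200, mean = 2, sd = 1.5)
negll <- function(p) -sum(dnorm(z, mean = p[1], sd = exp(p[2]), log = TRUE))
fit <- optim(c(0, 0), negll, method = "L-BFGS-B", hessian = TRUE)
H <- fit$hessian
if (rcond(H) > sqrt(.Machine$double.eps)) {
  se <- sqrt(diag(solve(H)))            # Wald standard errors
} else {
  se <- rep(NA_real_, length(fit$par))  # near-singular: do not trust the inverse
}
se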
2006 Jan 19
1
numericDeriv() giving a vector when multiple variables input
R Help List --
I have defined two time-series-vector-valued-functions, let them be f and g,
and want to find the numeric derivative of f with respect to the variable x
where f depends on x through g:
(d/dx) f(g(x)).
Moreover, x is a vector.
I tried this out the long way (naming every element of the x vector and then
making the 'theta' argument in numericDeriv() the character vector of
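A hedged sketch with stand-in f and g: there is no need to name every element individually, since numericDeriv's 'theta' can name the vector itself, and the gradient then has one column per element of x.

g <- function(x) cumsum(x)         # hypothetical inner transformation
f <- function(z) sum(z^2)          # hypothetical outer summary
x <- c(0.5, 1.0, 1.5, 2.0)
numericDeriv(quote(f(g(x))), "x")  # value of f(g(x)) plus a 1 x 4 gradient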
2003 Jul 16
2
numerical differentiation in R? (for optim "SANN" parscale)
Dear R users,
I am running a maximum likelihood model with optim. I chose the
simulated annealing method (method="SANN").
SANN is not performing badly, but I guess it would be much more effective
if I could set the `parscale' parameter.
The help says:
`parscale' A vector of scaling values for the parameters.
Optimization is performed on `par/parscale' and these
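A hedged sketch with a deliberately badly scaled stand-in objective: a common rule of thumb is to set parscale to the rough magnitude of each parameter, so that par/parscale is of order one.

negloglik <- function(p) (p[1] - 1000)^2 + 1e6 * (p[2] - 0.01)^2  # toy objective
p0 <- c(500, 0.5)
fit <- optim(p0, negloglik, method = "SANN",
             control = list(parscale = abs(p0), maxit = 20000))
fit$par   # SANN is stochastic; check how close this gets to (1000, 0.01)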
2003 Apr 25
1
numericDeriv and ecdf
Hi All,
The following expression:
x <- sort(rnorm(10)); e <- ecdf(x); d <- numericDeriv(e(x),"x");
makes d far from an approximation of the one-dimensional pdf.
What's wrong here, then?
Kind regards.
---------------------------------------------------------------------------
Valery A.Khamenya
Bioinformatics Department
BioVisioN AG, Hannover
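A hedged sketch of why this cannot give a pdf as written: numericDeriv needs an unevaluated expression (quote(...)), and even then the ecdf is a step function whose forward difference is zero between jumps; a kernel density estimate is the usual route to the pdf from a handful of points.

set.seed(1)
x <- sort(rnorm(10))
e <- ecdf(x)
d <- numericDeriv(quote(e(x)), "x")   # gradient of a step function: ~0 everywhere
plot(density(x))                      # smoothed estimate of the underlying pdf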
2005 Dec 04
1
Understanding nonlinear optimization and Rosenbrock's banana valley function?
GENERAL REFERENCE ON NONLINEAR OPTIMIZATION?
What are your favorite references on nonlinear optimization? I like
Bates and Watts (1988) Nonlinear Regression Analysis and Its
Applications (Wiley), especially for its key insights regarding
parameter effects vs. intrinsic curvature. Before I spent time and
money on several of the references cited on the help pages for "optim",
2007 May 29
1
Estimate Fisher Information by Hessian from OPTIM
Dear All,
I am trying to find the MLE by using the "optim" function.
Since it is difficult to differentiate some parameters in my objective function, I
would like to use the returned Hessian matrix to yield an estimate of
Fisher's information matrix.
My question: since the Hessian is calculated by numerical differentiation, is
it a reliable estimate? Otherwise I would have to do a lot of work to
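A hedged sketch with a stand-in exponential likelihood: the Hessian of the negative log-likelihood at the MLE is the observed information, and numDeriv::hessian() offers an independent, typically more accurate cross-check of optim's finite-difference value.

set.seed(2)
z <- rexp(100, rate = 2)
negll <- function(p) -sum(dexp(z, rate = exp(p), log = TRUE))   # p = log(rate)
fit <- optim(0, negll, method = "BFGS", hessian = TRUE)
H_optim <- fit$hessian
H_nd <- numDeriv::hessian(negll, fit$par)    # needs the numDeriv package
c(optim = sqrt(1 / H_optim), numDeriv = sqrt(1 / H_nd))  # SEs on the log scale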
2003 Oct 17
2
nlm, hessian, and derivatives in obj function?
I've been working on a new package and I have a few questions regarding the
behaviour of the nlm function. I've been (for better or worse) using the nlm
function to fit a linear model without suppling the hessian or gradient
attributes in the objective function. I'm curious as to why the nlm requires
31 iterations (for the linear model), and then it doesn't work when I try to
add
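A hedged sketch with a small least-squares objective (not the package code under discussion): nlm() uses analytic derivatives when the objective returns them as "gradient" and "hessian" attributes, which typically cuts the iteration count sharply.

set.seed(3)
X <- cbind(1, 1:10)
y <- 2 + 3 * (1:10) + rnorm(10, sd = 0.1)
obj <- function(beta) {
  r <- drop(y - X %*% beta)
  val <- sum(r^2)
  attr(val, "gradient") <- drop(-2 * t(X) %*% r)   # analytic gradient
  attr(val, "hessian")  <- 2 * t(X) %*% X          # analytic Hessian
  val
}
nlm(obj, c(0, 0), hessian = TRUE)   # converges in very few iterations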
2008 May 23
3
nls diagnostics?
Hi, All:
What tools exist for diagnosing singular gradient problems with
'nls'? Consider the following toy example:
DF1 <- data.frame(y=1:9, one=rep(1,9))
nlsToyProblem <- nls(y~(a+2*b)*one, DF1, start=list(a=1, b=1),
control=nls.control(warnOnly=TRUE))
Error in nlsModel(formula, mf, start, wts) :
singular gradient matrix at initial
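One concrete diagnostic, sketched on the same toy problem: build the gradient (Jacobian) matrix at the starting values and check its rank; here a and b only enter through a + 2*b, so the two columns are collinear.

DF1 <- data.frame(y = 1:9, one = rep(1, 9))
start <- list(a = 1, b = 1)
env <- list2env(c(DF1, start))
G <- attr(numericDeriv(quote((a + 2*b) * one), c("a", "b"), env), "gradient")
qr(G)$rank   # 1, but there are 2 parameters: hence the singular gradient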
2001 Oct 09
1
PROC MIXED user trying to use (n)lme...
Dear R-users
Coming from a proc mixed (SAS) background I am trying to get into
the use of (n)lme.
In this connection, I have some (presumably stupid) questions
which I am sure someone out there can answer:
1) With proc mixed it is easy to get hold of the estimated
variance parameters, as they can be written out to a SAS data set.
How do I do the same with lme objects? For example, I can see the
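For question 1, a hedged sketch on the built-in Orthodont data (not the poster's model): the estimated variance parameters of an lme fit are available programmatically via VarCorr() and intervals().

library(nlme)
fit <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont)
VarCorr(fit)                       # random-effect and residual variance estimates
intervals(fit, which = "var-cov")  # the same parameters with confidence intervals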
2007 Mar 02
2
nlm() problem : extra parameters
Hello:
Below is a toy logistic regression problem. When I wrote my own code,
Newton-Raphson converged in three iterations using both the gradient
and the Hessian and the starting values given below. But I can't
get nlm() to work! I would much appreciate any help.
> x
[1] 10.2 7.7 5.1 3.8 2.6
> y
[1] 9 8 3 2 1
> n
[1] 10 9 6 8 10
derfs4=function(b,x,y,n)
{
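A hedged sketch of the usual fix for the title problem (the poster's derfs4 is cut off above, so the negative log-likelihood here is written fresh): extra data arguments go through nlm's ... and are passed on to the objective by name.

x <- c(10.2, 7.7, 5.1, 3.8, 2.6)
y <- c(9, 8, 3, 2, 1)
n <- c(10, 9, 6, 8, 10)
negll <- function(b, x, y, n) {
  p <- plogis(b[1] + b[2] * x)               # logistic mean
  -sum(dbinom(y, size = n, prob = p, log = TRUE))
}
nlm(negll, p = c(0, 0), hessian = TRUE, x = x, y = y, n = n)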