Uzuner, Tolga
2005-May-05 22:38 UTC
[R] Numerical Derivative / Numerical Differentiation of unknown function
Ah... I searched for half an hour for this function... you know, the help
function in R could really be a lot better...

But wait a minute... looking at this, it appears you have to pass in an
expression. What if it is an unknown function, where you only have a handle
to the function, but you cannot see its implementation? Will this work then?

-----Original Message-----
From: Berton Gunter [mailto:gunter.berton at gene.com]
Sent: 05 May 2005 23:34
To: 'Uzuner, Tolga'; r-help at stat.math.ethz.ch
Subject: RE: [R] Numerical Derivative / Numerical Differentiation of unknown function

But...

See ?numericDeriv which already does it via a C call and hence is much
faster (and probably more accurate, too).

-- Bert Gunter
Genentech Non-Clinical Statistics
South San Francisco, CA

"The business of the statistician is to catalyze the scientific learning
process."  - George E. P. Box

> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Uzuner, Tolga
> Sent: Thursday, May 05, 2005 3:21 PM
> To: 'r-help at stat.math.ethz.ch'
> Subject: [R] Numerical Derivative / Numerical Differentiation of unknown function
>
> Hi,
>
> I have been trying to do numerical differentiation using R.
>
> I found some old S code using Richardson extrapolation which I managed
> to get to work. I am posting it here in case anyone needs it.
>
> ########################################################################
> richardson.grad <- function(func, x, d=0.01, eps=1e-4, r=6, show=F){
> # This function calculates a numerical approximation of the first
> # derivative of func at the point x. The calculation is done by
> # Richardson's extrapolation (see e.g. G. R. Linfield and J. E. T. Penny,
> # "Microcomputers in Numerical Analysis"). The method should be used if
> # accuracy, as opposed to speed, is important.
> #
> # * Modified by Paul Gilbert from original code by XINGQIAO LIU.
> # Calculates the first order derivative vector using a Richardson
> # extrapolation.
> #
> # GENERAL APPROACH
> #  -- Given the following initial values: interval value D, number of
> #     iterations r, and reduction factor v.
> #  -- The first order approximation to the derivative with respect to Xi is
> #
> #        F'(Xi) = {F(X1,...,Xi+D,...,Xn) - F(X1,...,Xi-D,...,Xn)} / (2*D)
> #
> #  -- Repeat r times with successively smaller D and then apply
> #     Richardson extrapolation.
> #
> # INPUT
> #   func  Name of the function.
> #   x     The parameters of func.
> #   d     Initial interval value (real), by default set to 0.01*x, or
> #         eps if x is 0.0.
> #   r     The number of Richardson improvement iterations.
> #   show  If T, show intermediate results.
> # OUTPUT
> #   The gradient vector.
>
>   v <- 2              # reduction factor
>   n <- length(x)      # integer, number of variables
>   a.mtr <- matrix(1, r, n)
>   b.mtr <- matrix(1, (r - 1), n)
>
> #--------------------------------------------------------------------------
> # 1  Calculate the derivative formula given in the 'GENERAL APPROACH'
> #    section.
> #    -- The first order derivatives are stored in the matrix a.mtr[k,i],
> #       where the indexing variables are k for rows (1 to r) and i for
> #       columns (1 to n); r is the number of iterations and n is the
> #       number of variables.
> #--------------------------------------------------------------------------
>
>   h <- abs(d*x) + eps*(x == 0.0)
>   for(k in 1:r) {     # successively reduce h
>     for(i in 1:n) {
>       x1.vct <- x2.vct <- x
>       x1.vct[i] <- x[i] + h[i]
>       x2.vct[i] <- x[i] - h[i]
>       if(k == 1) a.mtr[k,i] <- (func(x1.vct) - func(x2.vct))/(2*h[i])
>       else {
>         if(abs(a.mtr[(k-1),i]) > 1e-20)   # some functions are unstable near 0.0
>           a.mtr[k,i] <- (func(x1.vct) - func(x2.vct))/(2*h[i])
>         else a.mtr[k,i] <- 0
>       }
>     }
>     h <- h/v          # reduce h by 1/v
>   }
>   if(show) {
>     cat("\n", "first order approximations", "\n")
>     print(a.mtr, 12)
>   }
>
> #--------------------------------------------------------------------------
> # 2  Apply Richardson extrapolation to improve the accuracy of the first
> #    order derivatives. The algorithm is as follows:
> #    -- For each column of the derivative matrix a.mtr, say A1, A2, ..., Ar,
> #       calculate by Richardson extrapolation a new sequence of
> #       approximations B1, B2, ..., B(r-m) using the formula
> #
> #          B(i) = ( A(i+1)*4^m - A(i) ) / (4^m - 1),   i = 1, 2, ..., r-m
> #
> #       N.B. This formula assumes v = 2.
> #    -- Initially m is taken as 1, and the process is then repeated,
> #       restarting with the latest improved values and increasing m by
> #       one, until m equals r-1.
> # 3  Display the improved derivatives for each m from 1 to r-1 if the
> #    argument show=T.
> # 4  Return the final improved derivative vector.
> #--------------------------------------------------------------------------
>
>   for(m in 1:(r - 1)) {
>     for(i in 1:(r - m))
>       b.mtr[i,] <- (a.mtr[(i+1),]*(4^m) - a.mtr[i,])/(4^m - 1)
>     a.mtr <- b.mtr[1:(r - m), , drop = FALSE]   # keep only the improved rows
>     if(show & m != (r - 1)) {
>       cat("\n", "Richardson improvement group No. ", m, "\n")
>       print(a.mtr, 12)
>     }
>   }
>   a.mtr[1, ]          # the improved gradient vector
> }
>
> ## try it out
> richardson.grad(function(x){x^3}, 2)
> ########################################################################
>
> Regards,
> Tolga Uzuner
>
> ==============================================================================
> This message is for the sole use of the intended recipient...{{dropped}}
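On the question of unknown functions: a minimal sketch, assuming a made-up
opaque function blackbox and evaluation point x = 2 (neither is from the
thread), of how numericDeriv can handle a function that is only available as
a handle. numericDeriv evaluates the quoted call in the given environment, so
it never needs to see the implementation.

blackbox <- function(x) x^3           # stand-in for an opaque function handle
x <- 2
val <- numericDeriv(quote(blackbox(x)), theta = "x")
val                                   # value: 8
attr(val, "gradient")                 # roughly 12, the true 3 * 2^2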
Peter Dalgaard
2005-May-06 00:51 UTC
[R] Numerical Derivative / Numerical Differentiation of unknown function
"Uzuner, Tolga" <tolga.uzuner at csfb.com> writes:> Ah... I searched for half an hour for this function... you know, the > help function in R could really be a lot better... > > But wait a minute... looking at this, it appears you have to pass in > an expression. What if it is an unknown function, where you only > have a handle to the function, but you cannot see it's > implementation ? Will this work then ? > > -----Original Message----- > From: Berton Gunter [mailto:gunter.berton at gene.com] > Sent: 05 May 2005 23:34 > To: 'Uzuner, Tolga'; r-help at stat.math.ethz.ch > Subject: RE: [R] Numerical Derivative / Numerical Differentiation of > unknown funct ion > > > But... > > See ?numericDeriv which already does it via a C call and hence is much > faster (and probably more accurate,too). >The expression passed to numericDeriv can easily be a call to .C or similar. Actually, numericDeriv can get you in trouble if the function is not smooth enough. It basically just calculates (f(a+d)-f(a))/d where d is on the order of 1e-7 * a for each parameter. Sometimes a larger d and a higher order approximation is need to avoid getting stuck in the rough. (Yes, Bill, I do remember that you wanted an R News Programmer's Niche item from me on this...) -- O__ ---- Peter Dalgaard Blegdamsvej 3 c/ /'_ --- Dept. of Biostatistics 2200 Cph. N (*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918 ~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk) FAX: (+45) 35327907
Duncan Murdoch
2005-May-06 09:24 UTC
[R] Numerical Derivative / Numerical Differentiation of unknown function
Uzuner, Tolga wrote:

> Ah... I searched for half an hour for this function... you know, the
> help function in R could really be a lot better...

help() assumes you know the name of the thing you're looking for. You
should be using help.search('derivative'), which for me gave

  width.SJ(MASS)        Bandwidth Selection by Pilot Estimation of Derivatives
  D(stats)              Symbolic and Algorithmic Derivatives of Simple Expressions
  numericDeriv(stats)   Evaluate derivatives numerically

How could that be better?

Duncan Murdoch