This is an elementary calculus problem. You estimate

  y(A, B) = b0 + b1*A + b11*A^2 + b2*B + b22*B^2.

The first partial derivatives are:

  dy/dA = b1 + 2*b11*A
  dy/dB = b2 + 2*b22*B

Set both derivatives to 0 and solve for A and B, which gives
A = -b1/(2*b11) and B = -b2/(2*b22). This stationary point is a
unique global minimum if and only if b11 > 0 and b22 > 0.
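As a minimal sketch (re-using your data and fit from below), the closed form can be read straight off the fitted coefficients; the names "A", "I(A^2)", etc. are the ones lm() assigns to those terms:

```r
# Bart's data and fit
A <- rep(c(1, 4, 8), 3)
B <- rep(c(1, 3, 6), each = 3)
C <- c(3, 2, 3, 2, 1, 2, 3, 2, 3)
fit <- lm(C ~ I(A^2) + A + I(B^2) + B)

# Closed-form stationary point: A* = -b1/(2*b11), B* = -b2/(2*b22)
cf <- coef(fit)
Aopt <- -cf["A"] / (2 * cf["I(A^2)"])
Bopt <- -cf["B"] / (2 * cf["I(B^2)"])
c(Aopt, Bopt)  # a minimum provided both quadratic coefficients are positive
```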
An approach like yours will work, more or less, in many situations
where the calculus is harder or even unavailable. If the function you
want to minimize is more complex, you might consider "optim".
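For example, a minimal sketch with optim, minimizing the fitted surface numerically (re-using your data and fit; the starting values c(4, 4) are arbitrary):

```r
# Bart's data and fit
A <- rep(c(1, 4, 8), 3)
B <- rep(c(1, 3, 6), each = 3)
C <- c(3, 2, 3, 2, 1, 2, 3, 2, 3)
fit <- lm(C ~ I(A^2) + A + I(B^2) + B)

# Objective: predicted response at a candidate (A, B)
obj <- function(p) predict(fit, data.frame(A = p[1], B = p[2]))

res <- optim(c(4, 4), obj)  # default Nelder-Mead
res$par                     # approximate location of the minimum
```

This avoids the grid entirely and works for any objective function you can evaluate, not just quadratics.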
If you want confidence intervals, you could use the so-called "delta
method" if you know some calculus (or can find someone who does). Or you
could run a Monte Carlo simulation on top of what you did.
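A rough Monte Carlo sketch: draw coefficient vectors from their approximate sampling distribution and recompute the optimum for each draw. One caveat: your original C values happen to fit the quadratic exactly (zero residual variance), so purely for illustration a little noise is added below; with real, noisy data you would use the data as-is.

```r
library(MASS)  # for mvrnorm

set.seed(1)
A <- rep(c(1, 4, 8), 3)
B <- rep(c(1, 3, 6), each = 3)
C <- c(3, 2, 3, 2, 1, 2, 3, 2, 3) + rnorm(9, sd = 0.1)  # noise for illustration
fit <- lm(C ~ I(A^2) + A + I(B^2) + B)

# Simulate coefficient vectors and recompute A* = -b1/(2*b11) for each
draws <- mvrnorm(1000, mu = coef(fit), Sigma = vcov(fit))
Aopt.draws <- -draws[, "A"] / (2 * draws[, "I(A^2)"])
quantile(Aopt.draws, c(0.025, 0.975))  # rough 95% interval for A*
```

The same recipe gives an interval for B*, or for the minimum response itself.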
Hope this helps.
spencer graves
Bart Joosen wrote:
> Hi,
>
> I have a function of the second degree, with 2 parameters:
> y~A^2 + A + B^2 + B
>
> The response y is a measure of the precision of the analytical method,
> where A and B are method parameters. As it's necessary to keep the precision
> of the analytical method as good as possible, it's useful to optimize A and B
> to keep y as low as possible.
> But how can I do this with R?
> I have searched the archives and looked through the help pages (optimize,
> nlm, nls, ...) but couldn't find anything that looks like what I need.
> I have written a script which does the work, but I doubt this is the
> easiest way.
>
> Here are some data and the script:
> A<- rep(c(1,4,8),3)
> B<- rep(c(1,3,6),each=3)
> C <- c(3,2,3,2,1,2,3,2,3)
> fit <- lm(C~I(A^2)+A+I(B^2)+B)
>
> Now to optimize:
> new <- data.frame(A=rep(seq(0, 8, 0.5), each=17),
>                   B=rep(seq(0, 8, 0.5), 17))
> new$C <- predict(fit, new)
> new[which.min(new$C),]
>
> This gives me the values of A and B where C is minimized.
> Is there another way?
>
> Kind regards
>
> Bart
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html