I'm trying to fit a model to solve a biological problem. There are multiple independent variables, and there are also multiple responses. Each response is a function of all the independent variables, plus a set of parameters. All the responses depend on the same variables and parameters; only the form of the function changes to define each separate response.

Any ideas how I can fit parameters to my data? nls, as I understand it, will work for multiple independent variables, but not multiple responses.

All help gratefully received!

Sophie
Sophie Smout <scs10 at st-andrews.ac.uk> writes:

> I'm trying to fit a model to solve a biological problem. There are
> multiple independent variables, and also there are multiple responses.
> Each response is a function of all the independent variables, plus a set
> of parameters. All the responses depend on the same variables and
> parameters; only the form of the function changes to define each separate
> response.
>
> Any ideas how I can fit parameters to my data? nls, as I understand it,
> will work for multiple independent variables, but not multiple responses.

The Box-Draper criterion for multiresponse parameter estimation is discussed in chapter 4 of Bates and Watts, "Nonlinear Regression Analysis and Its Applications" (Wiley, 1988). There is even a detailed explanation there of a generalized Gauss-Newton algorithm to optimize the criterion.

We did not implement that in the nls function for R and, given the state of my "To Do" list, it is not likely to happen soon. If a skilled and energetic R/C programmer would like to undertake this project, I would be happy to provide advice on how it could be implemented.
Here is a solution that is fairly easy to do. (Not sure how statistically reasonable it is -- seems OK to me.)

Use nlm() or optimize() to minimise the sum of the squared residuals across all DVs simultaneously. If your IVs are a, b, c, your parameters are b0, b1, b2, and your DVs are x, y, z, then nlm() will iteratively minimize

(x - fnx(a,b,c,b0,b1,b2))^2 +
(y - fny(a,b,c,b0,b1,b2))^2 +
(z - fnz(a,b,c,b0,b1,b2))^2

Or I suppose you could write a function that returns a vector xhat, yhat, zhat.

Here is a simple 1 IV, 1 DV example to base your code on:

x <- c(0.02, 0.02, 0.06, 0.06, 0.11, 0.11, 0.22, 0.22, 0.56, 0.56, 1.10, 1.10)
y <- c(76, 47, 97, 107, 123, 139, 159, 152, 191, 201, 207, 200)
fn <- function(p) sum((y - (p[1] * x) / (p[2] + x))^2)
out <- nlm(fn, p = c(200, 0.1), hessian = TRUE)
out
# SSE = out$minimum; MSE = out$minimum / (n - p), where n is #pts, p is #params
# estimates = out$estimate
se <- sqrt(diag(2 * out$minimum / (length(y) - 2) * solve(out$hessian)))  # SEs

Bill
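A hedged sketch of how the same pooled-sum-of-squares idea might look with two responses that share the same parameters; the data, the two model forms, and the starting values below are invented purely for illustration and are not from the original posts.

x  <- c(0.02, 0.06, 0.11, 0.22, 0.56, 1.10)
y1 <- c(76, 97, 123, 159, 191, 207)           # hypothetical first response
y2 <- c(40, 55, 70, 90, 110, 118)             # hypothetical second response

fn2 <- function(p) {
  r1 <- y1 - (p[1] * x) / (p[2] + x)          # residuals, response 1
  r2 <- y2 - (0.5 * p[1] * x) / (p[2] + x)    # residuals, response 2 (different form, same parameters)
  sum(r1^2) + sum(r2^2)                       # pooled sum of squared residuals
}

out2 <- nlm(fn2, p = c(200, 0.1), hessian = TRUE)
out2$estimate                                 # joint estimates of p[1], p[2]

Note that the pooled criterion weights every response equally, which is only sensible if the responses are on comparable scales (see the follow-up below).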
Bill Simpson <W.Simpson at gcal.ac.uk> writes:

> Here is a solution that is fairly easy to do. (Not sure how statistically
> reasonable it is -- seems OK to me.)
>
> Use nlm() or optimize() to minimise the sum of the squared residuals
> across all DVs simultaneously. If your IVs are a, b, c, your parameters
> are b0, b1, b2, and your DVs are x, y, z, then nlm() will iteratively
> minimize
>
> (x - fnx(a,b,c,b0,b1,b2))^2 +
> (y - fny(a,b,c,b0,b1,b2))^2 +
> (z - fnz(a,b,c,b0,b1,b2))^2
> [...]

You are making assumptions in doing that. You are assuming that the "noise terms" on your responses are independent and have comparable variances. This may not be the case.

The Box-Draper criterion is to minimize the square of the product of the singular values of the matrix of the residuals. I would recommend using that instead.
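A minimal sketch of minimizing the Box-Draper determinant criterion directly with a general-purpose optimizer, rather than the specialized Gauss-Newton algorithm described in Bates and Watts; the data and the two model forms are the same invented example used in the sketch above, so everything except the criterion itself is an assumption.

x  <- c(0.02, 0.06, 0.11, 0.22, 0.56, 1.10)
y1 <- c(76, 97, 123, 159, 191, 207)           # hypothetical first response
y2 <- c(40, 55, 70, 90, 110, 118)             # hypothetical second response

box.draper <- function(p) {
  r1 <- y1 - (p[1] * x) / (p[2] + x)          # residuals, response 1
  r2 <- y2 - (0.5 * p[1] * x) / (p[2] + x)    # residuals, response 2
  Z  <- cbind(r1, r2)                         # n x m matrix of residuals
  det(crossprod(Z))                           # det(t(Z) %*% Z) = squared product of the singular values of Z
}

out.bd <- nlm(box.draper, p = c(200, 0.1))
out.bd$estimate                               # Box-Draper parameter estimates

Unlike the pooled sum of squares, this criterion does not require the noise terms on the different responses to be independent with comparable variances.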