Dear R-users,

My task is to fit two-dimensional density functions to grid data obtained by counting particles within grid cells. Using the adapt function I obtain a numerical integral of the density function over each grid cell, and using nlm I can minimize the negative log-likelihood. With nlm iteratively calling adapt, it should be possible to estimate the parameters of the density function.

However, adapt may change the number of points used for the integration in a grid cell as the parameters of the density function change. This can cause (very small) changes in the precision of the integration, which in turn cause (small) jumps in the likelihood function. This may be a problem when the gradients of the likelihood are small close to the minimum. I have inspected the number of points per grid cell used for function evaluation by adapt (minpts); when these do not change from one iteration to the next, it is likely that the same points were used. A more satisfactory solution, however, might be to fix the points used by adapt for shorter runs (e.g. five iterations). Is this possible? Alternatively, is it possible to output the points used by adapt for repeated use in another integration procedure? This might also reduce computation time.

Minimization: In some cases I have to estimate 6-8 parameters (when population sizes are included), and the likelihood function may have several local minima. Would the (combined) use of one or more of the algorithms in optim be more efficient than nlm?

Perhaps this is a nasty task. Any suggestions are welcome! If suitable methods exist in Java, the Omegahat R-Java interface would allow them to be used.

Can anyone recommend non-specialist references on these subjects?

Thanks in advance,
Karsten
email: kdb at kvl.dk
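To make the setup concrete, here is a rough sketch of what I have in mind, assuming the adapt() interface from the (now archived) adapt package and a multinomial model for the counts; the bivariate-normal density, the cells matrix of cell bounds and the counts vector are only placeholders, not my actual model:

library(adapt)

## placeholder density: independent normals with common sd, theta = (mu1, mu2, sd)
## (in practice one would optimise log(sd) to keep the sd positive)
dens <- function(x, theta)
  dnorm(x[1], theta[1], theta[3]) * dnorm(x[2], theta[2], theta[3])

## negative log-likelihood; 'cells' is assumed to be a matrix with
## columns xlo, xhi, ylo, yhi giving the bounds of each grid cell
negloglik <- function(theta, cells, counts) {
  p <- apply(cells, 1, function(b)
    adapt(ndim = 2, lower = b[c("xlo", "ylo")], upper = b[c("xhi", "yhi")],
          functn = dens, eps = 1e-6, theta = theta)$value)
  p <- p / sum(p)        # renormalise over the observed cells
  -sum(counts * log(p))  # multinomial log-likelihood, up to a constant
}

## fit <- nlm(negloglik, p = c(0, 0, 1), cells = cells, counts = counts)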
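For the minimization, one thing I have considered is a multi-start strategy: a derivative-free search from several starting values, followed by a gradient-based refinement of the best candidate. A sketch, reusing negloglik() from above (the number of starts, their ranges and the Nelder-Mead/BFGS combination are only an example, not something I know to be best here):

set.seed(1)
starts <- replicate(10, c(runif(2, -1, 1), runif(1, 0.5, 2)), simplify = FALSE)

## derivative-free search from each starting point
fits <- lapply(starts, function(p0)
  optim(p0, negloglik, method = "Nelder-Mead", cells = cells, counts = counts))

## refine the best candidate with a gradient-based method
best    <- fits[[which.min(sapply(fits, `[[`, "value"))]]
refined <- optim(best$par, negloglik, method = "BFGS",
                 cells = cells, counts = counts)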
You may want to check out smolyak.quad in the gss package.
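The advantage over adapt here would be that the nodes and weights are generated once and reused at every nlm iteration, so the integration error stays exactly the same from one iteration to the next and the small jumps in the likelihood disappear. A rough sketch, assuming smolyak.quad(d, k) returns a list with nodes $pt in the unit cube and matching weights $wt that integrate constants to 1 (please check ?smolyak.quad), and that the grid cells are rectangles:

library(gss)

## a fixed cubature rule, generated once; the accuracy level k = 6 is arbitrary
rule <- smolyak.quad(2, 6)

## integral of f(x, theta) over the rectangle [xlo,xhi] x [ylo,yhi],
## always using the same nodes, so there is no integration jitter
cell.integral <- function(f, xlo, xhi, ylo, yhi, theta) {
  x <- xlo + (xhi - xlo) * rule$pt[, 1]   # affine map of the unit-cube nodes
  y <- ylo + (yhi - ylo) * rule$pt[, 2]   # onto the cell
  vals <- apply(cbind(x, y), 1, f, theta = theta)
  (xhi - xlo) * (yhi - ylo) * sum(rule$wt * vals)
}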