sam.e
2010-Aug-29 19:42 UTC
[R] Finding functions of large dataset for numerical integration
Hello everyone,

I have been trying to figure out a way to integrate under a spline surface produced by Tps() in the fields package. As the package does not output a function, I am trying to do something similar to the trapezium rule. My data are 3D (x, y & z). I have extracted from the surface returned by Tps the values of z at regular intervals, so that I have a grid of figures, for example:

1 4 6 6 8
8 3 2 7 7
2 3 4 9 7
1 2 5 6 7
0 1 4 5 6

I was wondering what the best method is for working out a function over each square defined by four neighbouring corner values, so that I can then integrate? I have a number of these grids to process and am struggling to think of code, or where to look, that will handle my data in this way.

Any help or advice would be much appreciated!

Sam
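(For illustration, a minimal sketch of the 2D trapezium rule applied directly to such a grid of z values; the unit spacings dx and dy and the helper name trapz2d are assumptions for this sketch, not something from the original post.)

    ## 2D trapezium rule on a regular grid of z values.
    ## Corner points get weight 1/4, edge points 1/2, interior points 1.
    trapz2d <- function(z, dx = 1, dy = 1) {
      z  <- as.matrix(z)
      nr <- nrow(z); nc <- ncol(z)
      w_row <- c(0.5, rep(1, nr - 2), 0.5)   # weights down the rows (y direction)
      w_col <- c(0.5, rep(1, nc - 2), 0.5)   # weights across the columns (x direction)
      W <- outer(w_row, w_col)               # full weight matrix
      sum(W * z) * dx * dy
    }

    ## Example with the 5 x 5 grid from the post, assuming unit spacing:
    z <- matrix(c(1, 4, 6, 6, 8,
                  8, 3, 2, 7, 7,
                  2, 3, 4, 9, 7,
                  1, 2, 5, 6, 7,
                  0, 1, 4, 5, 6), nrow = 5, byrow = TRUE)
    trapz2d(z, dx = 1, dy = 1)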
David Winsemius
2010-Aug-29 20:38 UTC
[R] Finding functions of large dataset for numerical integration
On Aug 29, 2010, at 3:42 PM, sam.e wrote:

> Hello everyone,
>
> I have been trying to figure out a way to integrate under a spline
> surface produced by Tps() in the fields package. As the package does
> not output a function, I am trying to do something similar to the
> trapezium rule. My data are 3D (x, y & z). I have extracted from the
> surface returned by Tps the values of z at regular intervals, so that
> I have a grid of figures, for example:
>
> 1 4 6 6 8
> 8 3 2 7 7
> 2 3 4 9 7
> 1 2 5 6 7
> 0 1 4 5 6
>
> I was wondering what the best method is for working out a function
> over each square defined by four neighbouring corner values, so that
> I can then integrate? I have a number of these grids to process and
> am struggling to think of code, or where to look, that will handle my
> data in this way.
>
> Any help or advice would be much appreciated!

Not sure this is the perfect solution, since it does not take into account the trapezoidal approximations that might increase its accuracy, but here is some code that Chuck Berry offered a couple of months back to improve some code I had offered as a 2D-ECDF function:

> ?table
> ecdf.tbl2 <- # from Chuck Berry
+   function(mat) {
+     mat[is.na(mat)] <- 0
+     t( apply( apply( mat, 2, cumsum ), 1, cumsum )) }
>
> indat <- read.table(textConnection("1 4 6 6 8
+ 8 3 2 7 7
+ 2 3 4 9 7
+ 1 2 5 6 7
+ 0 1 4 5 6 ") )
> ecdf.tbl2(indat)
     V1 V2 V3 V4  V5
[1,]  1  5 11 17  25
[2,]  9 16 24 37  52
[3,] 11 21 33 55  77
[4,] 12 24 41 69  98
[5,] 12 25 46 79 114

Maybe it will give you some useful ideas, since your system is only 2D. There is also a mecdf package that might be examined for ideas.

I seem to remember reading that the functions in the old package "adapt" had been incorporated elsewhere in the R panoply. Running:

??"multivariate integration"

... makes me think that package cubature might be what I am remembering.

--
David Winsemius, MD
West Hartford, CT
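(A rough sketch of the cubature route mentioned above: integrate the fitted surface directly by evaluating predict() inside the integrand, rather than sampling it onto a grid first. It assumes a Tps object named fit and a 2-column matrix of fitted locations named xy, neither of which appears in the original thread; evaluating predict() point by point is slow, but it shows the idea.)

    ## Integrate a fitted Tps surface over the bounding box of the data
    ## using adaptive cubature. 'fit' and 'xy' are assumed to exist.
    library(fields)
    library(cubature)

    ## integrand: value of the fitted surface at a single (x, y) point
    f <- function(p) as.numeric(predict(fit, matrix(p, nrow = 1)))

    vol <- adaptIntegrate(f,
                          lowerLimit = apply(xy, 2, min),
                          upperLimit = apply(xy, 2, max))
    vol$integral   # estimated volume under the spline surface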