Hello,

I am getting memory allocation errors when running a function that uses locfit within a for loop. After 25 or so loops, it gives this error:

"Error: cannot allocate vector of size 281250 Kb"

I am running on a Linux cluster with a Gb of RAM. The problem never happens on my OS X machine (which has less memory). The total data is 130 cols by 5000 rows. The first 129 cols are response variables; the 130th is the parameter. The function fits a local regression between the 129 variables in the ith row of m[ ] and the 129 variables in the 5000 rows, after m was fed into 130 different vectors called Var1, ..., Var129, and PARAMETER.

array <- scan(("DataFile"),nlines=5000)
m <- matrix(array,ncol=130,byrow=T)

for (i in 1:200)
{
result <- function(m[i,c(1,....,129)],PARAMETER,cbind(Var1,...,Var129)seq(1,len=5000),F)
}

Any ideas on how to avoid this memory allocation problem would be greatly appreciated. Garbage collection? (Or is that too slow?)

Many thanks in advance!

Mike

Mike Hickerson
University of California
Museum of Vertebrate Zoology
3101 Valley Life Sciences Building
Berkeley, California 94720-3160 USA
voice: 510-642-8911
cell: 510-701-0861
fax: 510-643-8238
mhick@berkeley.edu

[[alternative text/enriched version deleted]]
The code snippet you provided is nowhere near correct or sufficient for anyone to help. Please (re-)read the posting guide and try again.

Andy

> From: Mike Hickerson
> [...]
Calling gc() before starting a memory-intensive task is normally a good idea, as it helps avoid memory fragmentation (which is possibly a problem on a 32-bit OS, but you did not say). R 2.1.0 beta has some dodges to help, so you may find it helpful to try that out.

On Mon, 4 Apr 2005, Mike Hickerson wrote:

> Hello
> [...]

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
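[Editor's sketch of the memory hygiene suggested above: preallocate the result container, keep only the small piece of each fit, drop large temporaries with rm(), and call gc() each iteration. The lm() call and simulated data are stand-ins, since the original post never showed the real locfit() call.]

```r
## Sketch only -- lm() stands in for the unknown locfit() call,
## and the data are simulated rather than scanned from "DataFile".
m <- matrix(rnorm(5000 * 130), ncol = 130)  # stand-in for the scanned data
y <- m[, 130]                               # 130th column: the parameter

results <- vector("list", 200)  # preallocate; growing a vector copies it each time
for (i in 1:200) {
  fit <- lm(y ~ m[, 1:129])     # stand-in for the real local-regression fit
  results[[i]] <- coef(fit)     # keep only the small summary you need
  rm(fit)                       # drop the large fit object...
  gc()                          # ...and return the freed memory promptly
}
```

Keeping 200 full fit objects alive (or letting the saved `result` capture the whole data set each pass) is a common way such loops exhaust memory even though each single fit succeeds.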