Hi, all;

I know there have been many discussions on memory usage in R. However, I have
an odd situation here. Basically, I have a rare opportunity to run R on a
system with 64GB of memory, without any limit on memory usage for any person
or process. Nevertheless, I encountered a memory error message like this:

Error: cannot allocate vector of size 594075 Kb

I got this error message while trying to apply the dChip preprocessing
procedures to 150 Affymetrix U133v2 chips, which have > 22,000 probe sets on
them. The actual code I ran was like this:

> Data <- ReadAffy(filenames = paste(HOME, "CelData/", fname, sep=""))
> mem.limits()
nsize vsize
   NA    NA
> gc()
            used  (Mb) gc trigger   (Mb)
Ncells    530216  14.2     899071   24.1
Vcells  76196137 581.4  243983468 1861.5
> eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE,
+                  pmcorrect.method="pmonly", summary.method="liwong")
normalization: invariantset
PM/MM correction : pmonly
expression values: liwong
normalizing...Error: cannot allocate vector of size 594075 Kb
> gc()
            used  (Mb) gc trigger   (Mb)
Ncells    797983  21.4    1710298   45.7
Vcells  76716811 585.4  305954068 2334.3
> object.size(Data)
[1] 608355664
> memory.profile()
    NILSXP     SYMSXP    LISTSXP     CLOSXP     ENVSXP    PROMSXP    LANGSXP
         1      30484     372373       4845        420        180     127274
SPECIALSXP BUILTINSXP    CHARSXP     LGLSXP                           INTSXP
       203       1168     111434       5296          0          0      44649
   REALSXP    CPLXSXP     STRSXP     DOTSXP     ANYSXP     VECSXP    EXPRSXP
     13382          9      60173          0          0      26002          0
  BCODESXP  EXTPTRSXP WEAKREFSXP
         0        106          0

Although I have no idea how memory allocation works in R, apparently something
is wrong here: the problem cannot simply be a lack of physical memory. My
question is this: is this memory problem due to a non-optimal configuration of
memory usage? If so, what would the optimal configuration be? If not, then
there must be a problem in the actual implementation of the functions I used
here, right? The reason I ask is that, according to the reference manual, the
error message I got can arise for roughly three reasons: first, when the
system is unable to provide the memory R requested; second, when the requested
memory size exceeds the address-space limit for a process; and finally, when
the length of a vector is larger than 2^31-1. I wonder whether the problem has
anything to do with the third case. (If so, then I think I am hopeless unless
the internal implementations change...)
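A minimal sketch of the checks that usually settle this kind of report, added
for illustration only; it assumes the Data object created by ReadAffy() above
is still in the workspace:

## Is this a 32-bit or a 64-bit build of R?
8 * .Machine$sizeof.pointer              # 32 on a 32-bit build, 64 on a 64-bit build
R.version$arch                           # architecture R was compiled for

## How much memory does the raw AffyBatch already occupy?
as.numeric(object.size(Data)) / 1024^2   # roughly 580 MB for the 150 CEL files

## Current heap state and any explicit limits set at startup:
gc()
mem.limits()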
Tae-Hoon Chung <thchung at tgen.org> writes:

> Hi, all;
>
> I know there have been many discussions on memory usage in R. However, I
> have an odd situation here. Basically, I have a rare opportunity to run R
> on a system with 64GB of memory, without any limit on memory usage for any
> person or process. Nevertheless, I encountered a memory error message like
> this:
>
> Error: cannot allocate vector of size 594075 Kb

[...]

> Although I have no idea how memory allocation works in R, apparently
> something is wrong here: the problem cannot simply be a lack of physical
> memory. My question is this: is this memory problem due to a non-optimal
> configuration of memory usage? If so, what would the optimal configuration
> be? If not, then there must be a problem in the actual implementation of
> the functions I used here, right? The reason I ask is that, according to
> the reference manual, the error message I got can arise for roughly three
> reasons: first, when the system is unable to provide the memory R
> requested; second, when the requested memory size exceeds the address-space
> limit for a process; and finally, when the length of a vector is larger
> than 2^31-1.

Hmm, the length issue should not kick in before the length exceeds 2 billion
or so, and you are not beyond 75 or 150 million (counting 8 or 4 bytes per
element).

> I wonder whether the problem has anything to do with the third case. (If
> so, then I think I am hopeless unless the internal implementations
> change...)

Well, revolutionaries often find themselves just below the cutting edge...

Just a sanity check: this is using a 64-bit compiled R on a 64-bit operating
system, right?

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
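The figures behind that remark, as a small worked sketch (assuming the usual
8 bytes per double and 4 bytes per integer):

## The failed request is far below the vector-length limit:
594075 * 1024 / 8    # 76,041,600  -- about 76 million elements if doubles
594075 * 1024 / 4    # 152,083,200 -- about 152 million elements if integers
2^31 - 1             # 2,147,483,647 -- maximum length of a single R vector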
Your lack of knowledge extends to the R posting guide: please consult it
before posting.

1) Do not post to two lists! I've removed the BioC list.

2) Do tell us your system details. It looks like you have a 32-bit version of
R (from the size of the Ncells), and you need a 64-bit version to make use of
more than about 3Gb, so your results seem completely consistent with the
limits of your build of R (rather than of R itself).

On Tue, 28 Dec 2004, Tae-Hoon Chung wrote:

> Hi, all;
>
> I know there have been many discussions on memory usage in R. However, I
> have an odd situation here. Basically, I have a rare opportunity to run R
> on a system with 64GB of memory, without any limit on memory usage for any
> person or process. Nevertheless, I encountered a memory error message like
> this:
>
> Error: cannot allocate vector of size 594075 Kb
>
> [...]

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
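For concreteness, a back-of-the-envelope sketch of the address-space
arithmetic; the 712 x 712 cell grid per chip below is an assumption inferred
from the figures in the post, not something stated in it:

chips    <- 150
cells    <- 712 * 712          # 506,944 cells per chip (assumed)
one_copy <- chips * cells * 8  # one full double-precision intensity matrix
one_copy / 1024                # 594,075 Kb -- exactly the failed request

## The second gc() above reports a Vcells trigger of about 2.3 GB; asking for
## one more ~580 MB block pushes past the roughly 3 GB of address space a
## 32-bit process can use, regardless of the 64 GB of physical RAM installed.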