Happy new year to all;

A few days ago, I posted a similar problem. At that time, I found out that
our R program had been compiled as 32-bit, not 64-bit. So R has been
re-installed as a 64-bit build and the same job was run: reading in 150
Affymetrix U133A v2 CEL files and performing dChip processing. However, the
memory problem happened again. Since the amount of physical memory is 64GB,
I think it should not be a problem. Is there any way we can configure memory
usage so that all physical memory can be utilized?

Our system is like this:
System type: IBM AIX Symmetric Multiprocessing (SMP)
OS version: SuSe 8 SP3a
CPU: 8
Memory: 64GB

The code is as follows:

> Data <- ReadAffy(filenames = paste(HOME, "CelData/", fname, sep=""))
> eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE,
                   pmcorrect.method="pmonly", summary.method="liwong")
normalization: invariantset
PM/MM correction : pmonly
expression values: liwong
normalizing...Error: cannot allocate vector of size 594075 Kb

> gc()
            used  (Mb) gc trigger   (Mb)
Ncells    797971  21.4    1710298   45.7
Vcells  76716794 585.4  305954055 2334.3
...

> mem.limits()
nsize vsize
   NA    NA

> object.size(Data)
[1] 608355664

> memory.profile()
    NILSXP     SYMSXP    LISTSXP     CLOSXP     ENVSXP    PROMSXP    LANGSXP
         1      30484     372383       4845        420        180     127274
SPECIALSXP BUILTINSXP    CHARSXP     LGLSXP                               INTSXP
       203       1168     111430       5296          0          0      44650
   REALSXP    CPLXSXP     STRSXP     DOTSXP     ANYSXP     VECSXP    EXPRSXP
     13382          9      60170          0          0      26003          0
  BCODESXP  EXTPTRSXP WEAKREFSXP
         0        106          0
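For reference: a minimal sketch of how memory limits can be inspected from
within R and raised when R is launched on a Unix build. The flag values shown
are illustrative assumptions, not settings from this session; see ?Memory for
the exact syntax.

## Inspect the allocator limits and current usage inside R
mem.limits()       # NA NA here means no nsize/vsize cap was given at startup
gc()               # current Ncells/Vcells usage and garbage-collection triggers
object.size(Data)  # size in bytes of the AffyBatch read by ReadAffy()

## Limits can be set or raised when R is started, e.g. from the shell:
##   R --max-vsize=4G --max-nsize=1M
## (illustrative values). With no --max-* flags on a 64-bit build, allocations
## are bounded by the OS limits (ulimit) and available RAM rather than by R.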
Have you checked whether there are limits set? What does `ulimit -a' say?
Do you know how much memory the R process is using when the error occurred?
We've had R jobs using upwards of 13GB on a box with 16GB of RAM (SLES8 on
dual Opterons) and never had problems.

Andy

> From: Tae-Hoon Chung
>
> [...] the R program has been re-installed in 64-bit and run the same job,
> reading in 150 Affymetrix U133A v2 CEL files [...] However, the memory
> problem happened again. [...]
>
> normalizing...Error: cannot allocate vector of size 594075 Kb
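A sketch of one way to answer both questions from within the running R
session; this assumes a Linux-style ps, and the column names differ on some
Unixes.

## Show the resource limits the R process inherited; system() runs a shell,
## and the subshell reports the same soft limits as this R process.
system("ulimit -a")

## Report the virtual and resident memory of this R session by pid.
system(paste("ps -o vsz,rss -p", Sys.getpid()))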
Tae-Hoon Chung <thchung at tgen.org> writes:

> [...]
> normalizing...Error: cannot allocate vector of size 594075 Kb
> > gc()
>             used  (Mb) gc trigger   (Mb)
> Ncells    797971  21.4    1710298   45.7

As Brian Ripley told you, 64-bit builds of R have 56-byte Ncells, so if yours
was one, you should have

> 797971*56/1024/1024
[1] 42.61625

i.e. 42.6Mb used for your Ncells, and it seems that you don't....

--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
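Spelling out the arithmetic, with the 28-byte Ncell size of a 32-bit build
assumed for comparison against the 56 bytes Peter cites for 64-bit builds:

ncells <- 797971        # Ncells "used" from the poster's gc() output
ncells * 56 / 1024^2    # 64-bit build: ~42.6 Mb, the figure Peter computes
ncells * 28 / 1024^2    # 32-bit build: ~21.3 Mb
## gc() reported 21.4 Mb, close to the 28-byte figure and nowhere near 42.6,
## which is why the session that produced the error looks like a 32-bit build.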