Hello. By way of background, I am running out of memory when attempting to
normalize the data from 160 Affymetrix microarrays using justRMA (from the affy
package). This is despite making 6 gigabytes of swap space available on our SGI
IRIX machine (which has 2 gigabytes of RAM). I have seen in various discussions
statements such as "you will need at least 6 gigabytes of memory to
normalize that many chips", but my question is this:
I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes as
attempting to do so results in this message:
WARNING: --max-vsize=4098M=4098`M': too large and ignored
I experience this both on my Windows box (on which I cannot allocate more than
4 gigabytes of swap space anyway) and on the above-mentioned SGI IRIX machine
(on which I can). In view of that, I do not see what good it does to make more
than 4 gigabytes of RAM + swap space available. Does this mean 4 gigabytes is
the absolute upper limit of R's memory usage... or perhaps 8 gigabytes, since
you can set both the vector heap (--max-vsize) and the node space (--max-nsize)
to 4 gigabytes each?
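For reference, a minimal sketch of how I am checking the limits from within a
running R session (the exact output will of course differ):

    > mem.limits()   # report the current nsize/vsize ceilings, if any were set
    > gc()           # run a garbage collection and report memory currently in use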
Thanks,
Eric
Did you compile R as a 64-bit executable on the IRIX machine? If not, R will be
subject to the 4GB limit of 32-bit systems. Search the archive for `Opteron'
and you'll see that the limit is not 4GB for 64-bit executables.

Andy
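A quick way to check from within R whether a given build is 64-bit (a sketch;
sizeof.pointer is 8 for a 64-bit executable and 4 for a 32-bit one):

    > .Machine$sizeof.pointer
    [1] 4    # 4 = a 32-bit build; 8 would indicate a 64-bit build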
> From: Liaw, Andy [mailto:andy_liaw at merck.com]
>
> Did you compile R as 64-bit executable on the Irix? If not, R will be
> subjected to the 4GB limit of 32-bit systems.

No...

> Search the archive for `Opteron' and you'll see that the limit is not 4GB,
> for 64-bit executables.

Excellent. I will recompile and try again.

Thanks,
Eric
Tae-Hoon Chung
2004-Jul-01 23:52 UTC
[R] Absolute ceiling on R's memory usage = 4 gigabytes?
Hi, Eric. This seems a little puzzling to me. Which Affymetrix chip do you use?
I ask because yesterday I was able to normalize 150 HG-U133A CEL files
(containing 22,283 probe sets) using R 1.9.1 on Mac OS X 10.3.3 with 1.5 GB of
memory. If your chip has more probes than this, then it would be
understandable...

Tae-Hoon Chung, Ph.D
Post-doctoral Research Fellow
Molecular Diagnostics and Target Validation Division
Translational Genomics Research Institute
1275 W Washington St, Tempe AZ 85281 USA
Phone: 602-343-8724
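P.S. For reference, this is roughly what I ran (a sketch; the CEL file
directory name is hypothetical):

    library(affy)
    # read and normalize every CEL file in the given directory in one step;
    # justRMA() skips building a full AffyBatch, which keeps memory use down
    eset <- justRMA(celfile.path = "/data/hu133a_cels")
    exprs(eset)[1:5, 1:3]   # peek at the normalized expression matrix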
Yes, we are using the HG-U133 Plus 2 chips with 50,000+ probe sets, and I
suppose the memory requirements grow in proportion to the number of probes per
chip.
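As a rough back-of-the-envelope check (a sketch; the figure of roughly 600,000
PM probes per HG-U133 Plus 2 array is my own estimate, not a vendor number):

    n.probes <- 600000                  # assumed PM probes per array (approximate)
    n.chips  <- 160
    bytes    <- n.probes * n.chips * 8  # one copy of the probe matrix, as doubles
    bytes / 2^30                        # ~0.7 GB per copy
    # RMA holds several working copies during background correction, quantile
    # normalization, and median polish, so the peak is a multiple of this.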
Thanks for your email... I can let you know whether we have any success, if
that would be useful to you for future reference.
-Eric
Yes... unfortunately, it looks like the lab that owns the IRIX machine does not
have a license for SGI's MIPSpro compilers, which are required for compiling a
64-bit R (I used gcc to compile the 32-bit version, but have not had success
compiling a 64-bit R with gcc on IRIX). But I am sure we will manage to compile
a 64-bit R somewhere soon.
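(For what it's worth, a quick way to test whether a given gcc can emit 64-bit
MIPS objects at all, as a sketch with hypothetical file names:

    $ echo 'int main(void){return 0;}' > conftest.c
    $ gcc -mabi=64 conftest.c -o conftest && file conftest

If `file' does not report a 64-bit ELF executable, that gcc cannot do a 64-bit
build.)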
Thanks,
Eric
-----Original Message-----
From: Paul Gilbert [mailto:pgilbert at bank-banque-canada.ca]
Sent: Fri 7/2/2004 10:39 AM
To: Kort, Eric
Cc: Tae-Hoon Chung; r-help at stat.math.ethz.ch
Subject: Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?
It looks like you have R compiled as a 32-bit application; you will need to
compile it as a 64-bit application if you want to address more than 4 GB of
memory. I am not familiar with the SGI IRIX machines, but you can do this on
many workstations whose processors have a 64-bit architecture and whose OS
supports it. The R-admin notes have some hints about how to do this for
various platforms.
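On IRIX with the SGI compilers, the configure step looks roughly like this (a
sketch only, not quoted from R-admin; the exact flags depend on your compiler
version):

    CC="cc -64" F77="f77 -64" CXX="CC -64" ./configure
    make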
Paul Gilbert