> From: Susan Shortreed [mailto:susanms at stat.washington.edu]
>
> I am using ESS on a Unix system for my analysis. My R
> environment contains a 90118 by 94 data frame. I am trying to
> calculate the mean of a column in this data frame and I am
> getting the following error:
>
> Error: cannot allocate vector of size 704 Kb
>
> I have tried
> options(memory=1000000000000000000)
> and this does not help.
>
> When I call gc(), this is what is returned:
> > gc()
>            used (Mb) gc trigger  (Mb)
> Ncells  1178845 31.5    2564037  68.5
> Vcells 11666683 89.1   38686231 295.2
>
> I tried calling mem.limits(nsize=1000000000000). Any value
> for vsize gives an NA error, and when I call gc() again the
> limit for Vcells is NA. There is more than enough memory
> available on the Unix machine: when I run top, I am using
> 0.0% of the memory, and the other handful of users are using
> about 10% altogether. I have also increased my user memory
> limit, as suggested in an email I found in the R-help
> archives, and that still did not help. It seems to me that
> 704 Kb is a rather small allocation to fail, and the memory
> appears to be available on the system.
>
> Any suggestions?
What exactly is this "unix" system? This could be important, because I was
recently made aware of a problem on AIX 5, where R was compiled as 32-bit.
By default the R process will only have access to 256 MB of RAM, even though
ulimit -a reports unlimited and the machine has GBs of RAM. I had to set an
environment variable before starting R to get it to use up to 4 GB of
memory.
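If your system is AIX, the variable is likely the loader control LDR_CNTRL,
which sets how many 256 MB data segments a 32-bit process may use. Treat the
value below as a sketch, not a recipe (it requests eight segments, i.e.
2 GB; the value you actually need depends on your setup):

    # allow 32-bit processes started from this shell up to
    # 8 x 256 MB = 2 GB of data segments, then start R
    export LDR_CNTRL=MAXDATA=0x80000000
    R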
This may not be your problem, but you do need to provide more details about
your system.
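For example, from a shell on that machine:

    # report the OS name/release and the R build in use
    uname -a
    R --version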
Andy
> Thank you,
> Susan
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
>