Displaying 20 results from an estimated 1000 matches similar to: "improve R memory under linux"
2010 Nov 04
1
Memory Management under Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
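A rough sketch of the allocation arithmetic (assuming the object is a vector of doubles): 231.8 Mb is roughly thirty million doubles, and R must find that much contiguous address space in one piece, which a fragmented 32-bit process often cannot supply even when plenty of RAM is free:
> 231.8 * 2^20 / 8    # bytes requested / 8 bytes per double: ~3e7 elements
> x <- numeric(3e7)   # ~229 Mb in one block; fails the same way here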
2010 Nov 05
1
R memory allocation in Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
2010 Nov 08
0
R memory allocation ubuntu
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
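On a 32-bit build the per-process address space (3-4 GB at best) caps R well below the machine's RAM, so the usual remedies are freeing large intermediates or moving to a 64-bit build. A minimal sketch (big.intermediate is a hypothetical object):
> rm(big.intermediate)      # drop a large object that is no longer needed
> gc()                      # hand the freed pages back to the allocator
> .Machine$sizeof.pointer   # 4 here confirms a 32-bit build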
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something...
Marcus
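The gc() totals above (Ncells 350000) are the defaults, so the --nsize request evidently did not take effect. In R versions of that era the ceilings could be inspected from within the session (mem.limits() has since been removed from R):
> mem.limits()   # nsize/vsize ceilings, NA if unlimited
> gc()           # the Ncells "total" column shows what actually took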
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory.
I understood that R takes a fixed amount of memory at startup (which I can
influence with --vsize --nsize) and that gc() shows the memory still free of
the total memory reserved for R.
However, if I create a long vector of character data, gc() only seems to
reflect the space needed for a vector of pointers to char, the space used
for the character data itself
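A small transcript that makes the observation visible (a sketch only; the accounting has changed across R versions as string storage moved fully into the R heap):
> gc()                                    # baseline
> x <- rep("one fairly long string", 1e6)
> gc()                                    # grows by roughly one pointer per element
> object.size(x)                          # per-object accounting, for comparison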
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it would still be nice to work on some of the data in R. For this I can use a dedicated server running Gentoo Linux and hosting only R. This server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
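In that setup the usual pattern is to let MySQL do the heavy filtering and pull only manageable chunks into R. A hedged sketch using the RMySQL package (the database and table names are placeholders):
> library(RMySQL)
> con <- dbConnect(MySQL(), dbname = "mydb")
> dat <- dbGetQuery(con, "SELECT * FROM results LIMIT 100000")
> dbDisconnect(con)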
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, with usually ~ 11GB free. I checked
system limits using ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
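Whatever ulimit says, a 32-bit build caps each process at roughly 4 GB of address space, so the 12 GB of physical memory is out of reach. A quick check from within R:
> .Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on 64-bit
> 2^(8 * 4) / 2^30          # 4 GB theoretical ceiling for a 32-bit process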
2002 Aug 06
2
Memory leak in R v1.5.1?
Hi,
I am trying to minimize a rather complex function of 5 parameters with
gafit and nlm. Besides some problems with both optimization algorithms
(with respect to consistently generating similar results), I tried to
run this optimization about a hundred times for yet two other parameters.
Unfortunately, as the log below shows, during that batch process R
starts to eat up all my RAM,
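A hedged way to pin down where the growth happens (f and p0 stand in for the poster's objective function and start values): record heap usage after each fit, forcing a collection each time. If usage still climbs, the leak is in compiled code rather than in unreferenced R objects:
> mem <- numeric(100)
> for (i in 1:100) {
+     fit <- nlm(f, p0)                  # f, p0: hypothetical placeholders
+     mem[i] <- gc()["Vcells", "used"]   # heap in use after collection
+ }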
1999 May 15
2
vsize and nsize
I am running R version ??? under Red Hat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
           free    total
Ncells    92202   200000
Vcells 12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
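One thing worth checking (an assumption about the likely cause, not a diagnosis): in R of that vintage the startup sizes could also come from environment variables, and the gc() totals reveal what was actually applied:
> Sys.getenv(c("R_NSIZE", "R_VSIZE"))   # startup defaults, if set
> gc()                                  # the "total" column shows the sizes in force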
2001 Jan 03
1
memory trouble
I don't know whether this belongs to r-devel or rather r-help.
Under RW1.11 --nsize=8M --vsize=512M I could
> n <- 500000
> m <- 20
> x <- matrix(rnorm(n*m), ncol=m, nrow=n)
> gc()
            free    total (Mb)
Ncells   8190509  8388608  160
Vcells  57033698 67108864  512
# under RW1.20 --vanilla
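Back-of-envelope for that matrix: 500000 x 20 doubles is only about 76 Mb, comfortably inside --vsize=512M, which is why it worked under RW1.11:
> n <- 500000; m <- 20
> n * m * 8 / 2^20   # ~76 Mb of Vcell (double) storage for the matrix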
2015 Jan 15
2
default min-v/nsize parameters
Just wanted to start a discussion on whether R could ship with more
appropriate GC parameters. Right now, loading the recommended package
Matrix leads to:
> library(Matrix)
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1076796 57.6    1368491 73.1  1198505 64.1
Vcells 1671329 12.8    2685683 20.5  1932418 14.8
Results may vary, but here R needed 64MB of N cells and 15MB
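The knobs already exist; the proposal is only about their defaults. For instance, a user can raise the initial thresholds at startup (values here are illustrative, not recommendations):
R --min-nsize=1500000 --min-vsize=32M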
2015 Jan 17
0
default min-v/nsize parameters
Martin Morgan discussed this a year or so ago and as I recall bumped
up these values to the current defaults. I don't recall details about
why we didn't go higher -- maybe Martin does. I suspect the main
concern would be with small memory machines in student labs and less
developed countries. If there were a way on all platforms to identify
how much memory is available, that might help to
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus
server. I can load 10 chips without a problem; however, when I try
to load 143 I receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory of R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in the help in
R). After running this command the
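When a whole-set load fails, a common workaround (assuming the Bioconductor affy package; cel.files is a placeholder vector of CEL file names) is to read in batches and reduce each batch before the next:
> library(affy)
> for (i in seq(1, length(cel.files), by = 10)) {
+     last <- min(i + 9, length(cel.files))
+     batch <- ReadAffy(filenames = cel.files[i:last])
+     # summarize/extract what is needed, then: rm(batch); gc()
+ }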
2000 Dec 14
2
cannot allocate vector of size in merge (PR#765)
Full_Name: Viktor Moravetski
Version: Version 1.2.0 (2000-12-13)
OS: Win-NT 4.0 SP5
Submission from: (NULL) (209.128.81.199)
I've started R (v.1.2.0) with the command:
rgui --vsize 450M --nsize 40M
Then at the command prompt:
> gc()
          used (Mb) gc trigger (Mb)
Ncells  358534  9.6   41943040 1120
Vcells 3469306 26.5   58982400  450
> df <- data.frame(x = 1:30000, y = 2, z = 3)
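The snippet cuts off before the merge() call, so the following is only a guess at the kind of call that triggers the error: merging on a non-unique key makes the result explode combinatorially.
> df <- data.frame(x = 1:30000, y = 2, z = 3)
> m <- merge(df, df, by = "y")   # every y matches every y: 30000^2 = 9e8 rows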
2011 Nov 13
1
Understand Ncells and Vcells, from gc()
Dear all,
I am working on a 64 bits Linux system.
I issue the following R commands:
> rm(list=ls()) # To remove all objects in the workspace.
> gc() # To free memory.
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124250  6.7     350000 18.7   350000 18.7
Vcells 124547  1.0     786432  6.0   476934  3.7
> gc() # I had to do it again, don't know why!
used (Mb) gc trigger (Mb) max used (Mb)
Ncells
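Slightly different numbers on the second call are expected: gc() itself allocates a little (its result, for one), and the "max used" columns only reset on request:
> gc(reset = TRUE)   # reset the "max used" columns to current usage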
2003 Nov 21
1
R memory allocation error - Unix
I am using ESS on a Unix system for my analysis. My R environment
contains a 90118 by 94 data frame. I am trying to calculate the mean of a
column in this data frame and I am getting the following error:
Error: cannot allocate vector of size 704 Kb
I have tried
options(memory=1000000000000000000)
and this does not help.
When I call gc(), this is what is returned:
> gc()
used
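options(memory=...) is not an R option, which is why it changes nothing. And the object itself is modest, so a failure on a 704 Kb request points at an exhausted process limit rather than at the data. A rough size check, assuming all-numeric columns:
> 90118 * 94 * 8 / 2^20   # ~64.6 Mb for the whole data frame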
1999 Apr 06
1
rw-faq clarification + simple question + bug(?)
Windows users note: the rw-faq says
|1.8) Can I use rw0xx with ESS and emacs?
|
|Yes. Some time soon versions of ESS (5.1.3 has a `somewhat rough'
|prototype for rw0632) will come with support for this version of R. If
|yours does not, edit essd-r.el to have
|
| (inferior-ess-start-args . "--ess"))
|
|and make sure you give the full path to Rterm.exe as the R executable.
2000 Aug 25
3
unexpected R crash - again
Sorry, but I lost this thread, so I am sending this as a new message.
This is really a follow-up to a post from a couple days ago saying that
fisher.test from the ctest library crashed on the following data set:
> T
[,1] [,2]
[1,] 2 1
[2,] 2 1
[3,] 4 0
[4,] 8 0
[5,] 6 0
[6,] 0 0
[7,] 1 0
[8,] 1 1
[9,] 7 1
[10,] 8 2
[11,]
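For reference, the reported call was presumably the following (ctest was a standard package in R 1.x; fisher.test now lives in stats). Note that T here is the poster's matrix, shadowing TRUE:
> library(ctest)   # R 1.x; not needed in modern R
> fisher.test(T)   # the call reported to crash on the table above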
2010 May 20
1
ERROR: cannot allocate vector of size?
I've looked through all of the posts about this issue (and there are
plenty!) but I am still unable to solve the error. ERROR: cannot allocate
vector of size 455 Mb
I am using R 2.6.2 - x86_64 on a Linux x86_64 Red Hat cluster system. When I
log in, based on the specs I provide [qsub -I -X -l arch=x86_64], I am
randomly assigned to an x86_64 node.
I am using package GenABEL. My data (~ 650,000
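Since node assignment is random, it is worth confirming what the assigned node actually offers before tuning R (Linux commands run via system()):
> system("free -m")     # physical and available memory on this node
> system("ulimit -v")   # any per-process virtual memory cap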
2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi,
when doing as.double() on an object that is already a double, the
object seems to be copied internally, doubling the memory requirement.
See the example below. The same happens for as.character() etc. Is this intended?
Example:
% R --vanilla
> x <- double(1e7)
> gc()
            used (Mb) gc trigger (Mb) max used (Mb)
Ncells    234019  6.3     467875 12.5   350000  9.4
Vcells  10103774 77.1
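tracemem() shows the duplication directly, if R was built with memory profiling (CRAN binaries are; note also that in recent R versions as.double() on a plain double vector no longer copies, so behaviour varies by version):
> x <- double(1e7)
> tracemem(x)         # announces every duplication of x
> y <- as.double(x)   # a copy here doubles the ~80 Mb footprint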