Displaying 20 results from an estimated 11000 matches similar to: "R jobs keep hanging linux server despite mem.limits modifications"
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
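An OS-level cap is one way to keep a runaway R job from taking the whole server down with it, independent of R's own vsize setting. A minimal sketch for a Linux login shell; the 4 GB figure and the script name are illustrative only:

ulimit -v 4194304        # cap address space at 4 GB (value in KB);
                         # a runaway R job is killed, not the server
R --vanilla < analysis.R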
2012 Sep 21
1
Defunct of --max-vsize and mem.limits
R-devel,
I am migrating from R 2.13.2 to R 2.15.1 and have just realized that the R command-line options --max-nsize and --max-vsize are no longer supported, and that mem.limits() is defunct. To me, these options, along with the other two, --min-nsize and --min-vsize, are useful in allowing some explicit control of R's memory usage. One benefit is that setting a maximum boundary could
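For reference, with mem.limits() gone, gc() still reports the state of both heaps, so a script can at least observe (if not bound) its usage. A sketch, using current gc() semantics:

g <- gc()                          # matrix with Ncells / Vcells rows
g["Vcells", "used"] * 8 / 2^20     # vector heap in use, in MB
                                   # (one Vcell holds one 8-byte double)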
2000 Jan 23
1
size limits
Hi,
I have a few questions about how to handle large data sets in R.
What is the size of the largest matrix that R can comfortably deal with?
Is this size limit imposed by R's software, or is it a question
of the machine that one runs on?
How does one go about choosing reasonable values of vsize
and nsize?
I have a data set with about 1,000,000 rows, and 30
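A back-of-the-envelope calculation answers the sizing part of the question: a numeric matrix costs 8 bytes per cell, so the dataset described above needs roughly:

1e6 * 30 * 8 / 2^20   # ~229 MB for the data alone
# vsize should exceed this comfortably, since many operations
# copy their arguments (2-3x the object size is a common rule)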
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of
R are. Say, what's the maximum heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
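For scale: an 8000 x 8000 numeric matrix alone needs 8000 * 8000 * 8 bytes = 512 MB of vector heap, so --vsize 30M cannot hold it no matter how --nsize is varied. Something on this order would be needed (figure illustrative):

rgui --vsize 600M --nsize 1000K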
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the command used to increase memory when a background R job is run, or is there some other command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
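For a background job, the same options documented on that Memory page can be passed through R CMD BATCH. A minimal sketch; the values are illustrative, and which options are honored varies by R version:

nohup R CMD BATCH --min-vsize=10M --max-vsize=2G \
    --min-nsize=500k --max-nsize=10M myjob.R myjob.Rout &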
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have not
found a solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 MB and
I cannot launch R with it.
I get an "xdr real data read error" and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing max-nsize to 40600k
did not
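The nsize options control only the node heap (cons cells); a large .RData is mostly vector data, so raising the vector heap at startup is the more likely fix. A sketch for R of that era, with an illustrative figure:

R --max-vsize=200M --max-nsize=40600k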
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject
matter (there aren't more than about a dozen), but I either did not find
solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 GB (mine is not a spatial SIG
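On 32-bit Windows the ceiling is the per-process address space, not installed RAM. The Windows-only memory.limit() (since removed in R 4.2) could raise R's commit limit up to that ceiling; a sketch:

memory.limit()             # current limit in MB
memory.limit(size = 3000)  # request ~3 GB; only effective with a
                           # 64-bit OS or the /3GB boot switch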
2009 Feb 01
0
setting a large value of --max-vsize
Hello,
I'm using a 64bit Linux with 16GB of RAM. I'd like to limit the memory
that the R process can use so I'm trying to use --max-vsize switch.
However, it seems that I can't enforce a limit above 2 GB.
shlomo@hippo:~$ uname -a
Linux hippo 2.6.24-16-generic #1 SMP Thu Apr 10 12:47:45 UTC 2008
x86_64 GNU/Linux
This WORKS:
--------------------
shlomo@hippo:~$ R
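A quick way to check whether a requested cap actually took effect is to allocate past it deliberately and look for a clean error rather than a hang. A sketch, assuming an R whose --max-vsize accepts the requested size:

x <- try(numeric(6e8))   # ~4.5 GB of doubles; should fail with
                         # "cannot allocate vector" under the cap
gc()                     # heap state after the failed allocation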
2005 Dec 20
1
Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save < inputfile > outputfile
however, in the new version, R 2.2.0, the parameters vsize and nsize are
ignored.
I can use the command memory.limit to increase memory, but I am not sure if
this corresponds to vsize and nsize.
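memory.limit() is Windows-only and bounds the total memory R may take from the OS, which is not the same thing as vsize/nsize. The plain --vsize/--nsize forms were replaced by min/max variants well before 2.2.0, which is consistent with them being ignored; a sketch of the equivalent invocation, carrying over the old figures:

Rterm --min-vsize=100M --min-nsize=5000K --restore --save < inputfile > outputfile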
2006 Mar 03
0
Memory problem
Hi list,
I am analysing a large dataset using random coefficient (using nlme) and
fixed effects (using the lm function) models. I am having problems with my
R version 2.2.1 due to memory allocation difficulties. When I try to expand
the memory, I get the following error message.
> R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500
--max-nsize=10000000
Error: target of assignment expands
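The leading '>' shows the command was typed at the R prompt. There R parses 'R --min-vsize=... --max-vsize=...' as an expression, and the '=' makes 'R - -min - vsize' the target of an assignment, producing exactly the error quoted above. These are shell options and must be given when R is started, e.g. (figures illustrative):

$ R --min-vsize=10M --max-vsize=1G --min-nsize=500k --max-nsize=10M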
2008 Feb 29
1
can the matrix size limit be increased?
Hi there,
I'm brand new to R, so let me know if this question is not
appropriate for this list. I've been reading through the
documentation and have tried a number of things, but am pretty much
stuck so far. Here's the session info:
> sessionInfo()
R version 2.6.2 (2008-02-08)
i386-apple-darwin8.10.1
locale:
C
attached base packages:
[1] stats graphics grDevices
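Independent of RAM, any R of this era (and any release before 3.0.0) caps a single vector, and hence a matrix, at 2^31 - 1 elements. A quick check:

.Machine$integer.max               # 2147483647 = 2^31 - 1 element limit
floor(sqrt(.Machine$integer.max))  # 46340: largest square matrix side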
2003 Nov 19
11
Windows R 1.8.0 hangs when Mem Usage >1.8GB
I have a loop that increases the size of an object after each iteration. When the Windows Task Manager shows "Mem Usage" about 1.8GB, the Rgui.exe process no longer responds.
I use:
"C:\Program Files\R\rw1080\bin\Rgui.exe" --max-mem-size=4000M --min-vsize=10M --max-vsize=3000M --min-nsize=500k --max-nsize=1000M
I have a dual Xeon 2.8GHz processor box with 4GB of memory and
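A 32-bit Windows process gets at most 2 GB of user address space (3 GB with the /3GB boot switch plus a large-address-aware executable), and heap fragmentation typically exhausts usable space before the hard line, so a hang near 1.8 GB is expected whatever --max-mem-size requests. The Windows-only memory.size() shows how close a session is:

memory.size()            # MB currently in use by this R process
memory.size(max = TRUE)  # peak MB obtained from the OS so far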
2008 Feb 12
2
Cox model
Hello R-community,
It's been a week now that I have been struggling with the implementation of a Cox
model in R. I have 80 cancer patients, so 80 time measurements and 80
relapse indicators (the censoring status: 1 if relapsed during the
examined period, 0 if not). My microarray data contain around 18000 genes.
So I have the expressions of 18000 genes in each of the 80 tumors (matrix
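With 18000 genes and 80 patients a joint model is not identifiable, so the usual starting point is one Cox fit per gene plus a multiplicity correction. A minimal sketch with the survival package; time, status and expr are placeholder names:

library(survival)
# time: 80 follow-up times; status: 80 censoring indicators (0/1)
# expr: 18000 x 80 matrix of expression values
pvals <- apply(expr, 1, function(g) {
    fit <- coxph(Surv(time, status) ~ g)
    summary(fit)$coefficients[1, "Pr(>|z|)"]
})
padj <- p.adjust(pvals, method = "BH")  # adjust for 18000 tests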
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some trouble with the database, I now do most of the work in MySQL. But it would still be nice to work on some of the data using R. For that I can use a dedicated server, with Gentoo Linux as the OS, hosting only R. This server is a nice machine with two CPUs and 4 GB of RAM, which should do the job:
Dual Intel XEON 3.06 GHz
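One way to keep the R-side footprint small on such a machine is to pull rows from MySQL in chunks instead of one whole table. A sketch using the RMySQL package; connection details and table names are placeholders:

library(RMySQL)
con <- dbConnect(MySQL(), dbname = "mydb", host = "localhost")
res <- dbSendQuery(con, "SELECT * FROM bigtable")
while (!dbHasCompleted(res)) {
    chunk <- fetch(res, n = 50000)  # 50000 rows at a time
    # ... aggregate or summarise the chunk, then discard it ...
}
dbClearResult(res); dbDisconnect(con)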
2000 Jul 20
1
bad R bug
Hi,
I am not on this mailing list, but here is a terrible bug that has
stopped me in my tracks. I am unable to remove observations from a data
matrix.
temp is the original matrix. Notice that there are 288 entries with a
104 in the first column. I attempt to remove these entries, but R does
not do it.
brad
ACTUAL COMMANDS:
> dim(temp)
[1] 30528 11
> table(temp[,1])
1 3 4
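For the record, this is subsetting semantics rather than a bug: subsetting never modifies a matrix in place, so the result must be assigned back. A sketch with the poster's names:

keep <- temp[, 1] != 104            # TRUE for the rows to retain
temp <- temp[keep, , drop = FALSE]  # reassign: ~288 rows drop out
table(temp[, 1])                    # 104 should now be absent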
2001 Jan 03
1
memory trouble
I don't know whether this belongs to r-devel or rather r-help.
Under RW1.11 --nsize=8M --vsize=512M I could
n <- 500000
m <- 20
x <- matrix(rnorm(n*m), ncol=m, nrow=n)
gc()
> n <- 500000
> m <- 20
> x <- matrix(rnorm(n*m), ncol=m, nrow=n)
> gc()
            free    total (Mb)
Ncells   8190509  8388608  160
Vcells  57033698 67108864  512
# under RW1.20 --vanilla
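The numbers above are self-consistent: one Vcell holds one 8-byte double, so the matrix needs 10^7 Vcells, well inside the 67108864 Vcells that --vsize=512M provides.

n <- 500000; m <- 20
n * m              # 1e7 Vcells required for x
(512 * 2^20) / 8   # 67108864 Vcells available under --vsize=512M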
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus
server. I can load 10 chips without a problem; however, when I try
to load 143 I receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory of R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in help in
R). After running this command the
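The failing request itself is not outlandish, so the 2500M vector heap is being exhausted cumulatively: earlier chips leave too little room for the next allocation. A quick check of the single failing request:

523263 / 2^10   # ~511 MB: the one allocation that failed

For Affymetrix data specifically, the affy package's justRMA() was the standard memory-frugal route, since it avoids holding all raw probe data at once.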
1999 Feb 15
1
.Rdata questions
Dear all,
in a current project I have a pretty huge .Rdata. Thus I was working with
R --vsize 100 --nsize 1000000. Today when I tried to restart R I get the
following error message:
Error: a read error occured
Fatal error: unable to restore saved data
(remove .RData or increase memory)
I increased memory up to --vsize 180 and --nsize 2000000, but the error
recurs. Is there a way to know
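In a session that can still load the data (for instance on a larger machine), the workspace can be sized directly and --vsize then set comfortably above that figure. A sketch:

sum(sapply(ls(), function(nm) object.size(get(nm)))) / 2^20
# total MB across all workspace objects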
2002 Apr 12
1
Problems with memory
Dear all,
I started working with R (vs 1041) a few weeks ago, and now I'm
having problems with the amount of memory.
I'm working on Windows ME; my computer has 128 MB of memory. I'm
using R under Emacs (ESS 5.1.20), and it is started by the
command:
Rterm --min-vsize=10M --max-vsize=100M --min-nsize=500k --max-nsize=1M
I have been having problems when executing a
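With 128 MB of physical RAM, the limits requested above are close to what the machine can back without swapping. A rough cost check, using the 28-byte node size of 32-bit R of that era:

1e6 * 28 / 2^20   # ~27 MB of node heap under --max-nsize=1M
# added to the 100 MB max vsize, this approaches the machine's
# 128 MB before Windows ME and Emacs take their share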
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, with usually ~11 GB free. I checked
system limits using ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
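The decisive constraint in this setup is the 32-bit build, not ulimit: a 32-bit process has roughly a 4 GB address space (often less usable in practice) regardless of the 12 GB installed. One way to confirm from inside R is to watch a controlled allocation fail well below physical memory; a sketch:

x <- try(numeric(5e8))   # ~3.7 GB of doubles: expected to fail in
                         # a 32-bit process despite ~11 GB free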