Displaying 20 results from an estimated 4000 matches similar to: "setting a large value of --max-vsize"
2012 Sep 21
1
Defunct of --max-vsize and mem.limits
R-devel,
I am migrating from R 2.13.2 to R 2.15.1 and have just realized that the R command line options --max-nsize and --max-vsize are no longer supported, along with the removal of mem.limits(). To me, that function and those options, together with the other two, --min-nsize and --min-vsize, are useful in allowing some explicit control of R memory usage. One benefit is that the setting of a maximum boundary could
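Without mem.limits(), usage can at least still be inspected; a minimal sketch (mine, not from the post), assuming a build with 8-byte Vcells:
g <- gc()                        # run a collection and capture the usage matrix
g["Vcells", "used"] * 8 / 2^20   # vector heap in use, in MB
g["Ncells", "used"]              # cons cells in use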
2004 Jul 20
1
--max-vsize and --max-nsize linux?
Hi,
sometimes I have trivial recodings like this:
> dim(tt)
[1] 252382 98
system.time(for (i in 2:length(tt)) {
    tt[, i][is.na(tt[, i])] <- 0
})
...and a Win2000 machine (XP 2000+, 1 GB) finishes it in several minutes, but
my Linux notebook (XP 2.6 GHz, 512 MB) does not succeed even after some hours.
I notice that the CPU load is mostly rather small, but the hard disk
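A vectorized alternative (my sketch, not from the thread) that avoids the per-column copies the loop forces, which is likely what drives the 512 MB notebook into swap; tt[-1] mirrors the loop's skipping of column 1:
tt[-1][is.na(tt[-1])] <- 0   # replace every NA outside column 1 in one pass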
2011 Jul 21
1
--max-vsize
Hi,
In both R 2.13 and the SVN trunk, I observe odd behaviour with the
--max-vsize command-line argument:
1. passing a largeish value (about 260M or greater) makes mem.limits()
report NA for the vsize limit; gc() continues to report a value...
2. ...but that value (and the actual limit) is wrong by a factor of 8.
I attach a patch for issue 2, lightly tested. I believe that fixing
issue 1
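For readers wondering where 8 comes from: the vector heap is accounted in Vcells of 8 bytes each, so a limit stored as bytes but reported as cells (or vice versa) is off by exactly that factor. A small illustration (mine):
bytes <- 260 * 2^20   # the "largeish" 260M figure, taken as bytes
bytes / 8             # the same quantity counted in 8-byte Vcells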
2000 Mar 13
1
check does not accept --vsize option (PR#481)
Full_Name: Markus Neteler
Version: 1.0.0
OS: Linux 2.2.10/i686
Submission from: (NULL) (130.75.72.37)
Hi,
I wanted to "check" the R.GRASS GIS interface from Roger Bivand:
http://www.geog.uni-hannover.de/grass/statsgrasslist.html
using
R CMD check --vsize=10M GRASS
but:
[error message shortened]
> G <- gmeta()
Error: heap memory (6144 Kb) exhausted [needed 1024 Kb more]
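A possible workaround of that era (hedged: it assumes the R_VSIZE environment variable consulted by R 1.x startup is inherited by the R process that R CMD check launches):
env R_VSIZE=10M R CMD check GRASS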
1999 May 15
2
vsize and nsize
I am running R version ??? under Redhat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
            free    total
Ncells     92202   200000
Vcells  12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
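Reading that gc() output (my arithmetic, not part of the original post): the Vcells line does reflect the request, the Ncells line does not:
100 * 2^20 / 8   # = 13107200, a 100 MB --vsize counted in 8-byte Vcells,
                 # matching the Vcells total above
# the Ncells total of 200000 is nowhere near the requested 5000000,
# so --nsize was indeed not applied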
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2008 Feb 12
2
Cox model
Hello R-community,
It's been a week now that I have been struggling with the implementation of a
Cox model in R. I have 80 cancer patients, so 80 time measurements and 80
relapse/no-relapse indicators (the censoring status: 1 if relapsed over the
examined period, 0 if not). My microarray data contain around 18000 genes,
so I have the expressions of 18000 genes in each of the 80 tumors (matrix
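A common first pass for this design (my sketch, not from the thread; time, status, and the 80 x 18000 expression matrix expr are hypothetical names):
library(survival)
# one univariate Cox model per gene, collecting the Wald p-values
pvals <- apply(expr, 2, function(g) {
    fit <- coxph(Surv(time, status) ~ g)
    summary(fit)$coefficients[, "Pr(>|z|)"]
})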
2001 Apr 02
1
Run out of memory
I am trying to use R to cluster 7129 samples. My data set is a 7129 x 38 matrix; when I try to compute the distance matrix using the function dist(), memory is exhausted. I tried to set the memory when I run R with
R --vsize=250M --nsize=1000k
No matter what I set for vsize, the result is the same; it says:
Error: heap memory (256000kb) exhausted [need 198498Kb more]
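A back-of-the-envelope size for that object (my arithmetic): dist() stores the lower triangle as doubles, so
n <- 7129
n * (n - 1) / 2 * 8 / 2^20   # ~194 MB for the triangle alone, consistent
                             # with the 198498Kb the error asks for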
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 Mb and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
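An OS-level safeguard (my suggestion, not from the thread): cap each session's address space in the shell before starting R, so a runaway allocation fails inside R instead of stalling the whole server:
ulimit -v 4194304   # per-process address-space cap in kB (~4 GB here)
R --vanilla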
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of R are.
Say, what's the max heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
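No such setting can work here; the requested object alone (my arithmetic) dwarfs the 30M vector heap:
8000 * 8000 * 8 / 2^20   # ~488 MB for one 8000 x 8000 numeric matrix,
                         # versus the 30 MB asked for with --vsize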
2006 Nov 29
2
--max-vsize option
The R memory docs say that the --*-vsize option takes an integer
argument and then 'G', 'M', 'K', or 'k'. When I start R using
R --max-vsize=10G
I receive the warning:
WARNING: --max-vsize=10G=10'M': too large and ignored
The system that I'm working on is a 64-bit Sun server with 40G of
memory. What is the correct syntax for this command?
Daniel
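Two hedged notes, not from the thread: a smaller value at least confirms that the flag parses, and if vsize is capped near 2^30-1 (as another post in this list reads the documentation), no spelling of 10 GB can be accepted by that build:
R --max-vsize=1024M   # within any 2^30-1 cap; checks the syntax itself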
2010 Jan 18
0
R jobs keep hanging linux server despite mem.limits modifications
My group is working with datasets between 100 Mb and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have
not found the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 Mb big
and I cannot launch R with it.
An "xdr real data read error occurred" and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing the max-nsize to 40600k
did not
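A hedged guess at the fix (mine): a 35 Mb .RData full of simulated arrays lives mostly on the vector heap, so it is the vsize limit, not max-nsize, that likely needs raising before the workspace is restored, e.g.
R --max-vsize=100M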
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject
matter (there aren't more than about a dozen) but either did not find
solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
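For that era's Windows builds, a hedged sketch (memory.limit() was the Windows-only knob then; note that a 32-bit process must also find a contiguous 1.1 Gb block, which fragmentation often prevents):
memory.limit()             # current cap in MB
memory.limit(size = 3000)  # ask for ~3 GB, if the OS and build allow it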
2005 Dec 20
1
Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save < input file > output file
but in the new version, R 2.2.0, the parameters vsize and nsize are
ignored.
I can use the command memory.limit to increase memory, but I am not sure if
this corresponds to vsize and nsize.
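A hedged note on the correspondence (my understanding of R 2.x on Windows, not from the thread): the heaps there grow automatically, and memory.limit() caps the total, superseding the old per-heap flags:
memory.limit(size = 1024)  # total cap in MB; gc() still reports Ncells
                           # and Vcells usage separately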
2006 Mar 03
0
Memory problem
Hi list,
I am analysing a large dataset using random coefficient (using nlme) and
fixed effects (using the lm function) models. I have a problem with my R
version 2.2.1 due to memory allocation difficulties. When I try to expand
the memory I get the following error message.
> R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500
--max-nsize=10000000
Error: target of assignment expands
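A hedged diagnosis (mine, though the "> " prompt above points the same way): the options were typed at the R prompt, where R parses the line as an assignment and fails. They belong on the shell command line, all on one line:
R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500 --max-nsize=10000000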
2000 Jan 23
1
size limits
Hi,
I have a few questions about how to handle large data sets in R.
What is the size of the largest matrix that R can comfortably deal with?
Is this size limit imposed by R's software, or is it a question
of the machine that one runs on?
How does one go about choosing reasonable values of vsize
and nsize?
I have a data set with about 1,000,000 rows, and 30
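A rough sizing for the data set described (my arithmetic, assuming 30 numeric columns):
1e6 * 30 * 8 / 2^20   # ~229 MB of vector heap just to hold it once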
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following command used to increase the memory when a background R job is run, or is there some other command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
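A hedged usage sketch (the script name is mine): vl, vu, nl, and nu are placeholders for sizes such as 512M, and for a background batch job the options go on the same command line:
R CMD BATCH --max-vsize=1024M myscript.R &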
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it could still be nice to work on some of the data using R. For this I can use a dedicated server with Gentoo Linux as the OS, hosting only R. This server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
2000 Jul 20
1
bad R bug
Hi,
I am not on this mailing list, but here is a terrible bug that has
stopped me in my tracks. I am unable to remove observations from a data
matrix.
temp is the original matrix. Notice that there are 288 entries with a
104 in the first column. I attempt to remove these entries, but R does
not do it.
brad
ACTUAL COMMANDS:
> dim(temp)
[1] 30528 11
> table(temp[,1])
1 3 4
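The usual idiom for this (my sketch; such reports have historically turned out to be a missing reassignment rather than a bug, since subsetting returns a new object):
temp <- temp[temp[, 1] != 104, , drop = FALSE]   # keep rows whose first
                                                 # column is not 104
table(temp[, 1])                                 # 104 should now be gone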