Displaying 20 results from an estimated 4000 matches similar to: "vsize and nsize"
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
Somehow it is not possible to increase nsize beyond
20000k. When I specify, e.g.,
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something...
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
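The two pools being tuned above can be watched from any session with plain gc(); a minimal sketch for current R, where the startup options are spelled --min-nsize/--max-nsize and --min-vsize/--max-vsize (see ?Memory) rather than the plain --nsize/--vsize of the old static allocator:

g <- gc()       # one row per pool
g["Ncells", ]   # fixed-size cons cells (language objects)
g["Vcells", ]   # vector heap, counted in 8-byte units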
2015 Jan 15
2
default min-v/nsize parameters
Just wanted to start a discussion on whether R could ship with more
appropriate GC parameters. Right now, loading the recommended package
Matrix leads to:
> library(Matrix)
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1076796 57.6    1368491 73.1  1198505 64.1
Vcells 1671329 12.8    2685683 20.5  1932418 14.8
Results may vary, but here R needed 64MB of N cells and 15MB
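For anyone experimenting along these lines, ?Memory also documents environment variables for the initial pool sizes, so no wrapper script is needed; a sketch, assuming a Unix shell (the values are illustrative, not recommendations):

$ R_NSIZE=2000000 R_VSIZE=64M R --quiet -e 'gc()'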
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2004 Jul 20
1
--max-vsize and --max-nsize linux?
Hi,
Sometimes I have trivial recodings like this:
> dim(tt)
[1] 252382 98
system.time(for (i in 2:length(tt)) {
  tt[, i][is.na(tt[, i])] <- 0   # zero the NAs, one column at a time
})
...and a Win2000 machine (XP 2000+, 1 GB) does it in several minutes, but
my Linux notebook (XP 2.6 GHz, 512 MB) doesn't succeed even after some hours.
I notice that the CPU load is relatively low most of the time, but the hard disk
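The loop above forces repeated copies of tt, which is what drags a 512 MB machine into swapping. A vectorized rewrite touches the data once; a sketch on the poster's 252382 x 98 data frame:

tt[is.na(tt)] <- 0   # replace all NAs in a single pass, no per-column copies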
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to get a lot of data into R (see older postings). After
solving some troubles with the database, I now do most of the work in MySQL.
But it would still be nice to work on some of the data in R. For that I can
use a dedicated server with Gentoo Linux as its OS, hosting only R. This
server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, usually with ~11 GB free. I checked the
system limits with ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
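For reference, the check described above, from a shell; note that even 'unlimited' does not lift the ~4 GB address-space ceiling of a 32-bit binary:

$ ulimit -a   # all per-process limits at a glance
$ ulimit -v   # virtual-memory cap in kilobytes, or 'unlimited'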
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH
Nimbus server. I can load 10 chips without a problem; however, when I try
to load all 143 I receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory available to R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as described in R's
help). After running this command the
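In R of that era the caps set on the command line could be verified from inside the session with mem.limits() (since removed; it was defunct as of R 3.2.0):

> mem.limits()   # reported the nsize/vsize caps, NA meaning unlimited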
2000 Dec 14
2
cannot allocate vector of size in merge (PR#765)
Full_Name: Viktor Moravetski
Version: Version 1.2.0 (2000-12-13)
OS: Win-NT 4.0 SP5
Submission from: (NULL) (209.128.81.199)
I've started R (v. 1.2.0) with the command:
rgui --vsize 450M --nsize 40M
Then at the command prompt:
> gc()
           used (Mb) gc trigger (Mb)
Ncells  358534  9.6    41943040 1120
Vcells 3469306 26.5    58982400  450
> df <- data.frame(x = 1:30000, y = 2, z = 3)
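The excerpt stops before the merge call, but the arithmetic hints at the failure: if the merge joined on the constant column y, a many-to-many match squares the row count. The numbers alone (runnable as-is, unlike the merge itself):

30000^2            # 9e8 rows in the cross join
30000^2 * 8 / 2^20 # ~6866 Mb for one numeric column of the result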
2010 Nov 04
1
Memory Management under Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
2010 Nov 05
1
improve R memory under linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
2000 Aug 25
3
unexpected R crash - again
Sorry, but I lost this thread, so I am sending this as a new message.
This is really a follow-up to a post from a couple of days ago saying that
fisher.test from the ctest library crashed on the following data set:
> T
      [,1] [,2]
 [1,]    2    1
 [2,]    2    1
 [3,]    4    0
 [4,]    8    0
 [5,]    6    0
 [6,]    0    0
 [7,]    1    0
 [8,]    1    1
 [9,]    7    1
[10,]    8    2
[11,]
2010 Nov 05
1
R memory allocation in Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
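The three near-identical Ubuntu posts above (Nov 04/05) founder on the same point: 231.8 Mb is the size of the single contiguous vector being requested, not the total in use, and on a 32-bit build the roughly 3 GB of usable address space fragments long before the 4 GB of RAM is exhausted. In doubles:

231.8 * 2^20 / 8   # ~30.4 million doubles needed in one contiguous block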
1999 Oct 06
2
R --nsize 2M runs havoc (under linux)
Dear All,
I am running R version 0.65.0 under
a) SuSE Linux 6.1 and SuSE Linux 6.2, compiler gcc 2.95, CPUs Pentium Pro
200, 128 MB, and Pentium II 450, 128 MB
b) Solaris 5.7, compiler gcc 2.95, CPU Sun SPARC, 4000 MB
When I set --nsize to more than 1M, R's internal storage management runs
amok. gc() indicates the requested sizes, but the overall process size is
much too big: Running R with
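The gap between gc()'s figures and the process size is easier to judge with per-cell costs in hand. ?Memory in current R gives 28 bytes per N cell on 32-bit builds (56 on 64-bit); the 1999 layout may have differed, so treat this as a sketch:

2e6 * 28 / 2^20   # --nsize=2M on 32-bit pins ~53 Mb for cons cells alone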
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory.
I understood that R takes a fixed amount of memory at startup (which I can
influence with --vsize and --nsize) and that gc() shows the memory still free
out of the total reserved for R.
However, if I create a long vector of character data, gc() only seems to
reflect the space needed for a vector of pointers to char; the space used
for the character data itself
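On current R the accounting can be probed directly (each distinct string is now stored once in a shared cache, so a character vector largely holds pointers, much as observed above); a small sketch:

x <- rep(sprintf("s%06d", 1:1000), 100)  # 100,000 elements, 1,000 distinct strings
print(object.size(x))                    # estimate; see ?object.size on shared strings
gc()                                     # compare Ncells/Vcells before and after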
2005 Jan 03
2
Memory problem ... Again
Happy new year to all;
A few days ago, I posted a similar problem. At that time, I found out that our
R program had been compiled as 32-bit, not 64-bit. So R was re-installed
as a 64-bit build and I ran the same job, reading in 150 Affymetrix U133A v2
CEL files and performing dChip processing. However, the memory problem
happened again. Since the amount of physical memory is 64 GB,
I think
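A quick sanity check that a rebuilt R really is a 64-bit binary, from inside the session:

> .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on 32-bit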
2000 Mar 13
1
check does not accept --vsize option (PR#481)
Full_Name: Markus Neteler
Version: 1.0.0
OS: Linux 2.2.10/i686
Submission from: (NULL) (130.75.72.37)
Hi,
I wanted to "check" the R.GRASS GIS interface from Roger Bivand:
http://www.geog.uni-hannover.de/grass/statsgrasslist.html
using
R CMD check --vsize=10M GRASS
but:
[error message shortened]
> G <- gmeta()
Error: heap memory (6144 Kb) exhausted [needed 1024 Kb more]
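A plausible workaround for the versions involved: environment variables are inherited by the R process that R CMD check spawns, and ?Memory documents R_VSIZE, so the heap can be widened without the rejected flag:

$ R_VSIZE=10M R CMD check GRASS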
2000 May 30
6
heap size trouble
Hi,
I've got trouble using R.
When I want to load a file that contains 93 thousand rows and 22 columns
of data (essentially floats),
R shows me this error message:
"heap size trouble"
Could anyone tell me which parameters I should specify before
launching R in order to load my big file?
Thanks a lot
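The arithmetic against the defaults of the day explains the failure (the 6144 Kb heap in the previous entry was typical): 93,000 rows of 22 floats stored as doubles need more than twice that.

93000 * 22 * 8 / 2^20   # ~15.6 Mb of doubles vs. a ~6 Mb default heap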
2015 Jan 18
2
default min-v/nsize parameters
On Thu, Jan 15, 2015 at 3:55 PM, Michael Lawrence
<lawrence.michael at gene.com> wrote:
> Just wanted to start a discussion on whether R could ship with more
> appropriate GC parameters.
I've been doing a number of similar measurements, and have come to the
same conclusion. R is currently very conservative about memory usage,
and this leads to unnecessarily poor performance on
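One way to make the collector's behaviour visible when reproducing such measurements; gcinfo() is base R:

gcinfo(TRUE)                                     # print a line at every collection
invisible(replicate(1e4, data.frame(x = 1:10)))  # allocation-heavy toy workload
gcinfo(FALSE)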
1999 Apr 06
1
rw-faq clarification + simple question + bug(?)
Windows users note: the rw-faq says
|1.8) Can I use rw0xx with ESS and emacs?
|
|Yes. Some time soon versions of ESS (5.1.3 has a `somewhat rough'
|prototype for rw0632) will come with support for this version of R. If
|yours does not, edit essd-r.el to have
|
| (inferior-ess-start-args . "--ess"))
|
|and make sure you give the full path to Rterm.exe as the R executable.
2015 Jan 20
1
default min-v/nsize parameters
>>>>> Peter Haverty <haverty.peter at gene.com>
>>>>> on Mon, 19 Jan 2015 08:50:08 -0800 writes:
> Hi All, This is a very important issue. It would be very
> sad to leave most users unaware of a free speedup of this
> size. These options don't appear in the R --help
> output. They really should be added there.
Indeed,
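The omission is easy to check against any given build; later versions did add --min-nsize/--min-vsize entries to the help output:

$ R --help | grep -i size   # see which size options your R lists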