Displaying 20 results from an estimated 2000 matches similar to: "astonishing memory phenomenon"
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please,
I have a 2 GB computer and a huge time series to embed, and I tried
increasing memory.limit() and memory.size(max=TRUE), but nothing helped.
Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  209552  5.6     407500 10.9   350000  9.4
Vcells  125966  1.0     786432  6.0   496686  3.8
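For reference, the usual first attempt on 32-bit Windows of that era looked like the sketch below (memory.limit() is Windows-only and was removed in R 4.2.0; the 2047 figure assumes the default 2 GB per-process address space):
> memory.limit(size = 2047)   # raise the cap toward the 32-bit ceiling
> gc()                        # collect before retrying the large allocation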
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, with usually ~11 GB free. I checked
system limits using ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs!
I have the following problem! I have a function that calls one or more
functions, depending on the input parameters. I am searching for the fastest
way to select and execute the selected functions and return their results in
a list. The number of possible functions is 10, however usually only 2 are
selected (although sometimes more, even all).
For example, if I have function
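A minimal sketch of one fast approach (the function names are illustrative, not from the post): keep the candidates in a named list, subset it, and lapply over the selection.
funs <- list(f1 = mean, f2 = median, f3 = sd)            # up to 10 candidates
run_selected <- function(x, which) lapply(funs[which], function(f) f(x))
run_selected(rnorm(100), c("f1", "f3"))                  # named list of results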
2005 Jan 14
1
S3/S4 classes performance comparison
Hi R-devel,
If you read my survey on R-help about reporting, you may have seen that
I am implementing a way to handle outputs for R (the main target output
destinations are xHTML and TeX).
In fact, I do have something that works for basic objects, done entirely
with S4 classes, with the results visible at:
http://www.stat.ucl.ac.be/ROMA/sample.htm
http://www.stat.ucl.ac.be/ROMA/sample.pdf
To
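A hedged sketch of the S4 pattern the post describes (the class, slot, and generic names here are invented for illustration):
setClass("texOutput", representation(file = "character"))
setGeneric("export", function(dest, x) standardGeneric("export"))
setMethod("export", "texOutput", function(dest, x) {
  cat("\\begin{verbatim}\n", format(x), "\n\\end{verbatim}\n",
      file = dest@file, append = TRUE)    # append each object to the TeX file
})
export(new("texOutput", file = "report.tex"), summary(rnorm(10)))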
2002 Oct 11
1
growing process size in simulation
I came across this in a simulation I ran under 1.6.0: If I do something
like
R> x <- rnorm(10)
R> rval <- NULL
R> for(i in 1:100000) rval <- t.test(x)$p.value
then the process size remains at about 14M under 1.5.1, but it seems to
grow almost linearly to more than 100M under 1.6.0.
I know that the above simulation is nonsense, but it was the simplest I
could come up
2002 Apr 29
1
Garbage collection: RW1041
I have searched through the archives but have been unable to find any related
issues - hopefully I'm not bringing up an old topic.
I am using RW1041 under Windows NT on a machine with 1 GB of memory. I have a
function doit() that reads in a chunk of data using readBin, performs a
regression, saves out coeffs and then returns. When using Rgui with the
default memory limit of 256Mb I'm able to
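A minimal sketch of the chunked readBin pattern described (file name, chunk size, and model are assumptions):
doit <- function(con, n = 1e5) {
  y <- readBin(con, what = "double", n = n)   # read one chunk of doubles
  x <- seq_along(y)
  coef(lm(y ~ x))                             # keep only the coefficients
}
con <- file("data.bin", open = "rb")
coefs <- doit(con)
close(con); gc()                              # release the chunk's memory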
2006 May 16
2
Large database help
Hello all.
I have a large .txt file whose variables are in fixed columns,
i.e., variable V1 goes from columns 1 to 7, V2 from 8 to 23, etc.
This is a 60GB file with 90 variables and 60 million observations.
I'm working with a Pentium 4, 1GB RAM, Windows XP Pro.
I tried the following code just to see if I could work with 2 variables
but it seems not possible:
R : Copyright 2005, The R Foundation
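A hedged sketch with read.fwf, pulling just the two fields named above (file name and row count are placeholders; characters beyond the listed widths are discarded, and a 60 GB file would still need chunked processing or a database):
two_vars <- read.fwf("big.txt",
                     widths = c(7, 16),          # V1 = cols 1-7, V2 = cols 8-23
                     col.names = c("V1", "V2"),
                     colClasses = "numeric",
                     n = 1e6)                    # start with a slice, not the whole file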
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello!
------
I use a 32-bit Linux (SuSE) server, so I'm limited to about 3.5 GB of memory.
I can demonstrate that from time to time there is a problem with allocating
large objects, for example:
0. state (no objects created yet)
------------------------------------
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  162070  4.4     350000  9.4   350000
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues?
I have tried using gc() to see how much memory is used at each step.
I have scanned Crawley's R Book, all the other R books I have available, and
the FAQ online, but found no real help.
Running WinXP Pro (32 bit) with 4 GB RAM.
One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as
virtual memory.
I do have another machine
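One way to make the per-step gc() bookkeeping the post attempts more informative; a sketch (the matrix stands in for a real workload):
gc(reset = TRUE)                       # reset the "max used" columns
x <- matrix(rnorm(1e6), ncol = 10)     # the step being measured
print(object.size(x), units = "Mb")    # size of the single object
gc()                                   # "max used" now reflects this step's peak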
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 Mb".
I know that similar problems were discussed a lot already, but I
haven't found any satisfactory answers so far!
Details:
*** I have XP (32-bit) with 4 GB RAM. At the time the problem
appeared I had 1.5 GB of available physical memory.
*** I increased the R memory limit to 3 GB via memory.limit(3000)
*** I did gc() and got
2001 Mar 13
3
gc() shrinks with multiple iterations
Is it expected behavior for gc() to return shrinking values as it gets
called multiple times? Here's what I've got:
> gc()
           used (Mb) gc trigger  (Mb)
Ncells   221754  6.0     467875  12.5
Vcells  3760209 28.7   14880310 113.6
> gc()
           used (Mb) gc trigger  (Mb)
Ncells   221760  6.0     467875  12.5
Vcells  3016206 23.1   11904247  90.9
> gc()
used (Mb) gc
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Dear R experts:
I am of course no R expert, but I use it regularly. I thought I would
share some experimentation with memory use. I run a linux machine
with about 4GB of memory, and R 2.5.0.
upon startup, gc() reports
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  268755 14.4     407500 21.8   350000 18.7
Vcells  139137  1.1     786432  6.0   444750  3.4
This is my baseline. Linux
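The kind of experiment the post describes can be sketched as a before/after comparison (the character vector is only an example object):
before <- gc()
x <- as.character(1:1e6)               # build the object under study
after <- gc()
after[, "used"] - before[, "used"]     # extra Ncells/Vcells held after the step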
2004 Aug 07
1
memory usage of S4 methods
Hi,
I have some problems with the memory usage of S4-generics. For example, I
observed the following behaviour:
> gc()
          used (Mb) gc trigger (Mb)
Ncells  432091 11.6     531268 14.2
Vcells  116052  0.9     786432  6.0
> setClass("A",representation(x="numeric"));
[1] "A"
> setClass("B",representation(x="numeric"));
[1] "B"
2011 Jan 17
1
isoreg memory leak?
I believe there is a memory leak in isoreg in the current version of R,
as the following shows:
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  120405  3.3     350000  9.4   350000  9.4
Vcells   78639  0.6     786432  6.0   392463  3.0
> x <- runif(10000)   # x must exist before the loop; its definition was cut from the excerpt
> for(k in 1:100) {
+   y <- runif(10000)
+   isoreg(x, y)
+ }
> rm(x)
> rm(y)
> gc()
used (Mb) gc
2010 Nov 04
1
Memory Management under Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
2010 Nov 05
1
improve R memory under linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
2004 Jul 03
4
counting the occurrences of vectors
Hi:
I have two matrices, A and B, where A is n x k, and B is m x k, where n >> m >> k. Is there a computationally fast way to count the number of times each row (a k-vector) of B occurs in A? Thanks for any suggestions.
Best,
Ravi.
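One plausible approach (not from the thread): collapse each row into a single string key, then tabulate, avoiding an m-by-n double loop.
A <- matrix(sample(1:3, 300, replace = TRUE), ncol = 3)   # n x k example data
B <- matrix(sample(1:3, 30, replace = TRUE), ncol = 3)    # m x k
keyA <- apply(A, 1, paste, collapse = "\r")               # one key per row
keyB <- apply(B, 1, paste, collapse = "\r")
tab <- table(keyA)
counts <- as.integer(tab[keyB])      # NA where a row of B never occurs in A
counts[is.na(counts)] <- 0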
2012 Mar 04
1
hash table clean-up
Hello,
I have noticed that the memory usage inside an R session increases as
more and more objects with unique names are created, even after they
are removed. Here is a small reproducible example:
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  531720 14.2     899071 24.1   818163 21.9
Vcells  247949  1.9     786432  6.0   641735  4.9
>
> for (i in 1:100000) {
+ name <-
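The loop is cut off above; a hedged reconstruction of the pattern the post describes (the generated names are assumptions):
for (i in 1:100000) {
  name <- paste("tmp", i, sep = "")   # a unique name each iteration
  assign(name, runif(1))
  rm(list = name)                     # the value is freed, the name is not
}
gc()   # "used" stays above baseline: symbols accumulate in R's symbol table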
2010 Nov 05
1
R memory allocation in Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb.
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times at r-help
2002 Feb 01
1
Memory leak in read.table (PR#1292)
Full_Name: Ashley Ford
Version: 1.4.0
OS: Windows NT4
Submission from: (NULL) (146.80.9.20)
I am suffering from a memory leak in read.table in the new precompiled Windows
1.4. It works fine in 1.3.
Create a 90000-line file of 7 variables, e.g.
perl -e '$e=exp(1);for($i=0;$i<90000;$i++){printf "%d".(" %f"x6)."\n",
$i,$i*$e,3,4,5,6,7,8,9}' > n90000
R :
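The excerpt is cut at the R session; presumably the report then read the file back repeatedly, along the lines of:
for (i in 1:20) {
  d <- read.table("n90000")   # 90000 rows x 7 numeric columns
  rm(d); gc()                 # under 1.4.0 the process size kept growing anyway
}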