Displaying 3 results from an estimated 3 matches for "509mb".
2006 May 21
1
increase size of the limit of memory (PR#8885)
...2.2.1
OS: windows
Submission from: (NULL) (128.59.110.149)
If you run the following code in R after version 2.1.1:
n.sims <- 6000; n <- 30000
y <- array(NA, c(n.sims, n))
with output:
Error: cannot allocate vector of size 703125 Kb
In addition: Warning messages:
1: Reached total allocation of 509Mb: see help(memory.size)
2: Reached total allocation of 509Mb: see help(memory.size)
Then I try to increase the limit with a command that worked in R before version
2.0.1 and should allow allocating the vector y of size 703125 Kb:
memory.limit(14000000000)
but now we get this output:
Error...
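On Windows builds of that era, memory.limit() takes its size argument in megabytes, so a value like 14000000000 is rejected out of hand. A minimal sketch of the usual workaround, assuming a 32-bit Windows build of R 2.2.x; the 1500 Mb figure is an illustrative choice, not taken from the report:
# Hedged sketch: memory.limit() is Windows-only and expects megabytes.
memory.size(max = TRUE)    # high-water mark of allocations so far, in Mb
memory.limit(size = 1500)  # the 703125 Kb (~687 Mb) vector then fits
y <- array(NA, c(6000, 30000))  # logical array: 6000*30000*4 bytes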
2005 Feb 16
1
problem with da.mix
Hello,
We use the mix package and we have a problem with the DA function. We aren't
sure, but it may be a memory problem.
We have done:
> Ent <- read.table("C:/.../File.txt")
> attach(Ent)
> Ent
    V1 V2 V3 V4 ... V16 V17
1    1  1  2  6 ...  18  18
2    1  1  1 NA ...  14  17
3    1  1  2  1 ...  16  14
....
199  2  1 NA  7 ...  19  18
200
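The excerpt cuts off before the da.mix call itself. For reference, a minimal sketch of the usual call sequence in Schafer's mix package, assuming Ent is the 200 x 17 data frame above; the number of categorical columns (p = 4 here), the seed, and the step count are illustrative guesses, not taken from the report:
library(mix)
x <- data.matrix(Ent)      # mix wants a numeric matrix with the
                           # categorical columns first, coded 1, 2, ...
s <- prelim.mix(x, p = 4)  # p = number of categorical columns (assumed)
theta <- em.mix(s)         # ML estimate as a starting value for DA
rngseed(1234567)           # rngseed() must be called before da.mix()
theta.da <- da.mix(s, theta, steps = 100)  # one DA run of 100 steps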
2008 Nov 29
75
Slow death-spiral with zfs gzip-9 compression
I am trying to perform a test prior to moving my data to Solaris and ZFS. Things are going very poorly. Please suggest what I might do to understand what is going on, file a meaningful bug report, fix it, whatever!
Both to learn what the compression ratio could be, and to induce a heavy load that exposes issues, I am running with compress=gzip-9.
I have two machines, both identical 800MHz P3 with