Displaying 20 results from an estimated 1000 matches similar to: "Run out of memory"
2001 Jan 14
2
Help
Dear sir,
I am using R in windows. I want to extend R Memory
size.
I use the following command, but unfortunately it
doesn't work.
-- vsize=15M --nsize=1000K
Your help is appreciated.
Thanks,
Esmail Amiri.
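A likely fix, assuming the Windows builds of that era: the dashes must be
attached to the option names, and the options go on the Rgui command line
(or in the Target field of a Windows shortcut), e.g.

    Rgui.exe --vsize=15M --nsize=1000K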
2000 Mar 17
2
Windows Memory
I'm sure this question is answered in the help file, but apparently I'm not reading it correctly.
Running the Windows version of R (1.00.0) and loading a table (35K rows by 10 columns) from Excel using the read.table command, I receive the following message.
Error: cons memory (350000 cells) exhausted
See "help(Memory)" on how to increase the number of cons cells.
From reading the
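As help(Memory) explained at the time, the remedy is to start R with a
larger --nsize; a sketch, the cell count illustrative only:

    Rgui.exe --nsize=1000K

A 35K x 10 table needs well over the default 350,000 cons cells while
read.table parses it.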
2001 Mar 01
3
How do you expand memory capability (Was: R crashes in Windows ME)
Hello-
Since my data set in SPSS has > 40 variables, I think that R crashes because of the memory limit.
In Maindonald's UsingR text, on pg 3, there's a footnote that reads:
"If you want larger memory space than the default you may want a target akin to
<path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K
[The default is --vsize 6M --nsize 250K
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have
not found a solution yet.
As I use R to simulate several arrays of data, my workspace is now 35Mb big and
I cannot launch R with it.
An "xdr real data read error occured" and R tells me to delete .RData or
increase memory. I WON'T delete this file and changing the max-nsize to 40600k
did not
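A workaround consistent with the error message is to launch R with limits
large enough for the restored workspace; a sketch, values illustrative,
leaving generous headroom over the 35Mb file:

    Rgui.exe --vsize=150M --max-nsize=50000k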
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of R are. What
is the maximum heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
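Arithmetic explains the failures regardless of --nsize: an 8000 x 8000
numeric matrix alone needs

    8000 * 8000 * 8 / 2^20    # about 488 Mb of vector heap

so no --vsize 30M invocation can hold it; the vector heap, not the number
of cons cells, is the binding limit here.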
2000 Aug 25
3
unexpected R crash - again
Sorry, but I lost this thread, so I am sending this as a new message.
This is really a follow-up to a post from a couple days ago saying that
fisher.test from the ctest library crashed on the following data set:
> T
      [,1] [,2]
 [1,]    2    1
 [2,]    2    1
 [3,]    4    0
 [4,]    8    0
 [5,]    6    0
 [6,]    0    0
 [7,]    1    0
 [8,]    1    1
 [9,]    7    1
[10,]    8    2
[11,]
2001 Mar 21
3
memory allocation error
Hi,
I have recently installed R-1.2.2 for Windows (16MB RAM, P-166) and I am
getting the following message after processing my data (6 variables and
1200 observations):
>Error: cannot allocate vector of size 4 Kb
>In addition: Warning message:
>Reached total allocation of 15Mb: see help(memory.size)
Then the program closes.
With the previous version, 1.1.1 (I think), I didn't have this
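The Windows builds of that era set the total-allocation cap (15Mb here)
with --max-mem-size; a sketch, the value illustrative, though with 16MB of
physical RAM anything much larger will mostly swap:

    Rgui.exe --max-mem-size=32M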
2008 Feb 12
2
Cox model
Hello R-community,
I have been struggling for a week now with the implementation of a Cox
model in R. I have 80 cancer patients, so 80 time measurements and 80
relapse indicators (the censoring variable: 1 if the patient relapsed over the
examined period, 0 if not). My microarray data contain around 18000 genes.
So I have the expressions of 18000 genes in each of the 80 tumors (matrix
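With 18000 genes but only 80 patients, one Cox model on all genes at once
cannot be fit; the usual screen is a univariate model per gene. A minimal
sketch, assuming the survival package and hypothetical objects expr (an
18000 x 80 matrix), time and status (length-80 vectors):

    library(survival)
    # p-value of each gene's univariate Cox model
    pvals <- apply(expr, 1, function(g) {
        fit <- coxph(Surv(time, status) ~ g)
        summary(fit)$coefficients[1, "Pr(>|z|)"]
    })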
1999 Nov 12
1
R-0.65.1 Startup
Dear R users,
I have noticed that my R startup is extremely slow. It takes almost 3
minutes from "double-click" to R prompt. I have been running R-0.64.1 till
recently and it took about 30 sec. I still have access to R-0.64.1. When I
started it up, it took about 25 sec. Can anyone tell me if this is a bug in
R or a problem with my machine?
Note: This is after bootup with R being the
2009 Feb 01
0
setting a large value of --max-vsize
Hello,
I'm using 64-bit Linux with 16GB of RAM. I'd like to limit the memory
that the R process can use, so I'm trying to use the --max-vsize switch.
However, it seems that I can't enforce a limit above 2GB.
shlomo@hippo:~$ uname -a
Linux hippo 2.6.24-16-generic #1 SMP Thu Apr 10 12:47:45 UTC 2008
x86_64 GNU/Linux
This WORKS:
--------------------
shlomo@hippo:~$ R
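One way to see which limit actually took effect is mem.limits(), available
in the R of that era; a sketch, the requested size illustrative:

    $ R --max-vsize=16000M --vanilla
    > mem.limits()    # reports the enforced nsize/vsize caps; NA = unlimited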
1999 Jul 29
0
scan
Is there a way to read large datasets efficiently, directly into a matrix
byrow? I know data.frame, but for large datasets it doesn't work
efficiently, even if I increase the cons memory.
R --nsize 1000k --vsize 90M
...
> x<-read.table("pendler.luft.txt")
Error: cons memory (1000000 cells) exhausted
See "help(Memory)" on how to increase the number of cons
2000 Feb 10
0
Re: your mail about Memory on Windows95
>I use WinNT. You have to "launch" R from a DOS-shell window; you first
>change to the directory where you have RGui.exe, then type:
> Rgui --vsize 15M --nsize 1000k
>
>and R opens with increased memory size. It works here with my NT box, and
>will probably do with Win95 (though I think that the DOS-shells are not
>strictly similar in both OSs but this may not be a
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 Mb and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
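When R's own cap cannot express the desired bound, an operating-system
limit per login is an alternative, so a runaway job is refused memory
instead of hanging the server; a sketch (bash, value illustrative):

    ulimit -v 8388608    # cap the process at 8 GB of virtual memory (kB units)
    R --vanilla < job.R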
1999 May 15
2
vsize and nsize
I am running R version ??? under Redhat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
           free    total
Ncells    92202   200000
Vcells 12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
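The gc() output itself shows what happened: each Vcell is 8 bytes, so the
Vcells total works out to exactly the requested amount, while Ncells stayed
at 200000 rather than the requested 5000000, i.e. --vsize was honoured and
--nsize was not:

    13107200 * 8 / 2^20    # = 100 Mb: --vsize took effect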
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
Somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something....
Marcus
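A clue is in the numbers: 350000 Ncells is the default of that era (the
same figure as the "cons memory (350000 cells) exhausted" message above),
so a --nsize beyond the 20000k ceiling is apparently not clamped to the
maximum but silently replaced by the default:

    21000 * 1000    # requested: 21,000,000 cells
    350000          # granted: the built-in default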
2004 Jul 20
1
--max-vsize and --max-nsize linux?
Hi,
Sometimes I have trivial recodings like this:
> dim(tt)
[1] 252382 98
system.time(for(i in 2:length(tt)){
tt[,i][is.na(tt[,i])] <- 0
})
...and a Win2000 machine (XP2000+, 1GB) finishes it in several minutes, but
my Linux notebook (XP 2.6GHz, 512MB) does not succeed even after some hours.
I notice that the CPU load is relatively small most of the time, but the hard disk
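Aside from memory settings, the loop itself is a bottleneck: each
iteration copies the large object. A single vectorized replacement, which
works on both matrices and data frames, is usually far faster (assuming
every column is to be recoded; the original loop skips column 1):

    tt[is.na(tt)] <- 0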
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the
subject (there aren't more than about a dozen) but either did not find
solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
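On the Windows builds of that era the ceiling could also be raised from
within the session; a sketch, the value illustrative and bounded by what
the OS can give a 32-bit process:

    memory.limit()               # report the current ceiling in Mb
    memory.limit(size = 3000)    # request a 3000 Mb ceiling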
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R, and R really struggles!
After starting R with the following:
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
I run a function that essentially picks up an external dataset with 2121
rows
and 30 columns and builds a lm() object and also runs step() ... the step()
takes forever to run...(takes very
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the command used to increase memory when an R job is run in the background, or is there another command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
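That is the right general form; the placeholders vl, vu, nl, nu and N must
be replaced with actual sizes. A sketch with illustrative values for a
background job:

    R CMD BATCH --max-vsize=1G --max-nsize=20M myjob.R myjob.Rout &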