similar to: Help

Displaying 20 results from an estimated 3000 matches similar to: "Help"

2000 Mar 17
2
Windows Memory
I'm sure this question is answered in the help file, but likely I'm not reading it correctly. Running the Windows version 1.00.0, loading a table (35K rows by 10 columns) from Excel using the read.table command, I receive the following message:
Error: cons memory (350000 cells) exhausted
See "help(Memory)" on how to increase the number of cons cells.
From reading the
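A side observation plus a sketch (mine, not from the post): 35,000 rows x 10 columns is 350,000 values, the same count the error reports as exhausted, plausibly because read.table first holds each field as a separate node, so the table alone would fill that cons heap. The era's help(Memory) remedy was a larger --nsize at startup; the values below are illustrative, not recommendations:

  35000 * 10   # 350,000 -- matches the exhausted cons-cell count

  rgui --vsize 30M --nsize 2000K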
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone, I have a problem that some people may have already encountered, but I have not found the solution yet. As I use R to simulate several arrays of data, my workspace is now 35Mb big and I cannot launch R with it. An "xdr real data read error" occurred and R tells me to delete .RData or increase memory. I WON'T delete this file, and changing the max-nsize to 40600k did not
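A hedged guess at a remedy (the excerpt ends before any answer): a 35 Mb image is mostly vector data, so it is the vector heap, not the cons-cell limit, that must grow before the .RData can be read back. With the command-line options of that R generation, something like:

  R --vsize=80M --nsize=1000K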
1999 Apr 27
2
Memory management
Dear all, I don't get it: first of all, the help doesn't say what the memory limits of R are. Say, what's the max heap size, for instance? Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
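The failure is predictable from arithmetic alone (mine, not the poster's): an 8000 x 8000 numeric matrix holds 64 million doubles, far beyond every --vsize tried above:

  8000 * 8000 * 8 / 2^20   # about 488 Mb for one double matrix, vs. the 30 Mb vsize requested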
2001 Mar 01
3
How do you expand memory capability (Was: R crashes in Windows ME)
Hello- Since my data bank in SPSS has > 40 variables, I think that R crashes because of the memory limit. In Maindonald's UsingR text, on pg 3, there's a footnote that reads: "If you want larger memory space than the default you may want a target akin to <path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K [The default is --vsize 6M --nsize 250K
2000 Aug 25
3
unexpected R crash - again
Sorry, but I lost this thread, so I am sending this as a new message. This is really a follow-up to a post from a couple of days ago saying that fisher.test from the ctest library crashed on the following data set:
> T
      [,1] [,2]
 [1,]    2    1
 [2,]    2    1
 [3,]    4    0
 [4,]    8    0
 [5,]    6    0
 [6,]    0    0
 [7,]    1    0
 [8,]    1    1
 [9,]    7    1
[10,]    8    2
[11,]
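For what it's worth, the modern stats::fisher.test exposes a workspace argument for the exact-test network algorithm, and enlarging it is the standard response to FEXACT workspace failures; whether the old ctest build honored it, and whether it would have prevented this crash, are assumptions on my part. A minimal sketch on the data as shown:

  # Rebuild the table from the post and enlarge the exact-test workspace (sketch;
  # the all-zero row 6 may itself have been part of the original problem)
  tab <- cbind(c(2, 2, 4, 8, 6, 0, 1, 1, 7, 8),
               c(1, 1, 0, 0, 0, 0, 0, 1, 1, 2))
  fisher.test(tab, workspace = 2e6)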
2001 Mar 21
3
memory allocation error
Hi, I have recently installed R-1.2.2 for Windows (16MB RAM, P-166) and I am getting the following message after processing my data (6 variables and 1200 observations):
> Error: cannot allocate vector of size 4 Kb
> In addition: Warning message:
> Reached total allocation of 15Mb: see help(memory.size)
Then the program closes. With the last version, 1.1.1 (I think), I didn't have this
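help(memory.size) of that era points at the Windows-specific allocation cap. A minimal sketch of inspecting and raising it -- memory.limit(size =) is the knob in Windows builds of that period (removed in R >= 4.2; the startup flag --max-mem-size served the same role), and with only 16 MB of physical RAM anything much above 15 Mb will mostly mean swapping:

  memory.size()            # Mb currently allocated (Windows-only)
  memory.limit()           # the current cap in Mb
  memory.limit(size = 32)  # request a 32 Mb cap -- sketch; this machine has 16 MB RAM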
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")...
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger I hit
2001 Apr 02
1
Run out of memory
I am trying to use R to cluster 7129 samples; my data set is a 7129 x 38 matrix. When I try to get the distance matrix using the function dist(), the memory is exhausted. I tried to set the memory when I run R with
R --vsize=250M --nsize=1000k
but no matter what I set for vsize the result is the same; it says:
Error: heap memory (256000Kb) exhausted [need 198498Kb more]
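The shortfall is exactly the size of the object being requested: dist() on n = 7129 rows stores n(n-1)/2 double-precision distances, and (my arithmetic, matching the message above):

  n <- 7129
  n * (n - 1) / 2              # 25,407,756 pairwise distances
  n * (n - 1) / 2 * 8 / 1024   # 198,498 Kb -- precisely the reported shortfall

So no vsize below about 445 Mb (the 256,000 Kb already in use plus this object) can satisfy the call; with this many samples, an algorithm that avoids the full distance matrix, such as clara() in the cluster package, may be the more practical route (a suggestion, not from the thread).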
1999 Nov 12
1
R-0.65.1 Startup
Dear R users, I have noticed that my R startup is extremely slow. It takes almost 3 minutes from "double-click" to R prompt. I have been running R-0.64.1 till recently and it took about 30 sec. I still have access to R-0.64.1. When I started it up, it took about 25 sec. Can anyone tell me if this is a bug in R or a problem with my machine? Note: This is after bootup with R being the
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 Mb and 1 GB in size, using multiple logins. From the documentation, it appears that vsize is limited to 2^30-1, which proves too restrictive for our use. When we drop that restriction (set vsize = NA) we end up hanging the server, which requires a restart. Is there any way to increase the memory limits on R while keeping our jobs from
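One hedged approach (an assumption on my part; the excerpt ends before any answer): rather than removing the cap entirely, raise it to a large but finite value and let the operating system enforce a hard per-process ceiling, so a runaway job dies with an allocation error instead of dragging the whole server into swap:

  ulimit -v 6291456    # shell-level hard cap: 6 GB of virtual memory (Linux, units of Kb)
  R --max-vsize=4G     # a finite R-level ceiling instead of vsize = NA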
1999 May 15
2
vsize and nsize
I am running R version ??? under Redhat 5.2. It seems as though the --nsize option has no effect on the size of the allocated Ncells as determined using gc(). Yes, I have that much data.... That is, if I invoke R with
R --vsize 100 --nsize 5000000
and then type gc() I get:
            free    total
Ncells     92202   200000
Vcells  12928414 13107200
Thanks, Tony Long, Ecology and Evolutionary Biology, Steinhaus
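One detail the gc() output gives away (my arithmetic, not the poster's): the Vcells total is exactly 100 Mb of 8-byte cells, so the --vsize request was honored and only the --nsize request was lost, Ncells staying at its 200,000 default:

  100 * 1024^2 / 8   # 13,107,200 -- precisely the Vcells total reported above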
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers, somehow it is not possible to increase nsize to more than 20000k. When I specify e.g.
R --vsize=10M --nsize=21000K
the result is:
            free    total  (Mb)
Ncells     99658   350000   6.7
Vcells   1219173  1310720  10.0
Maybe I have overlooked something... Marcus
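The totals show the request was silently capped: 350,000 Ncells is nowhere near the 21,000K asked for. In R releases of that period the heap ceilings could be queried, and where permitted raised, from a running session with mem.limits(); the exact signature here is from memory and should be treated as an assumption:

  mem.limits()                  # report the nsize/vsize ceilings; NA means unlimited (older R)
  mem.limits(nsize = 21000000)  # attempt to raise the cons-cell ceiling (sketch)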
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R and R really struggles! After starting R with the following:
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2), I run a function that essentially picks up an external dataset with 2121 rows and 30 columns, builds an lm() object, and also runs step()... the step() takes forever to run... (takes very
2009 May 07
1
increasing memory for R bg job
Hi, is the following the command used to increase the memory when a background R job is run, or is there some other command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source: http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx Carol
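The placeholders (vl, vu, nl, nu, N) just need concrete sizes. A minimal sketch of a background batch run -- the script name and the particular values are assumptions, not recommendations:

  nohup R CMD BATCH --min-vsize=10M --max-vsize=2G --min-nsize=500k --max-nsize=50M sim.R sim.Rout &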
2000 Aug 17
2
R on os390
G'day R friends, I didn't get any replies on the main list, so I thought I'd try with the experts. I was wondering if anyone has ported R to os390. If so, are the vsize and nsize limits the same as on other platforms? I could really annoy those SAS guys then. thanks, John Strumila john.strumila@team.telstra.com
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject matter (there aren't more than about a dozen), but I either did not find solutions or found that they did not work. Here is the issue. I am trying to run a spatial regression on a medium-sized dataset. Some of the functions in the spdep package I use require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
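Context worth adding (general Windows facts, not from the thread): on a 32-bit build the user-mode address space tops out near 2 Gb, and a contiguous 1.1 Gb block is often unavailable even below that because of fragmentation, so the allocation can fail while memory.limit() looks ample. The usual knobs, sketched for a 32-bit R of that era:

  Rgui.exe --max-mem-size=2047M   (startup flag, Windows-only)

and, inside R:

  memory.limit()              # current cap in Mb (Windows-only; removed in R >= 4.2)
  memory.limit(size = 2047)   # raise it; going much past ~2047 Mb needs a 64-bit build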
1999 Jun 15
2
ESS and R
For anybody who uses ESS with R, how do you invoke the vsize and nsize options when you call R? I can't find any appropriate variables from an apropos. Thanks, Jord -- Jordan Howarth, CSIRO Mathematical and Information Sciences, mailto:jordan.howarth at cmis.csiro.au
2001 Mar 12
4
1.2.2 under M$ windows 2000 lots of plots out of memory?
hi - If I source the following:
for (k in seq(1:20)) {
  x <- runif(20000, min = -500, max = 2000)
  y <- runif(20000, min = -500, max = 2500)
  z <- runif(20000, min = -10, max = 10)
  cat(k, "file", memory.size())
  cc <- rainbow(11)
  plot(x, y, asp = 1, xlim = c(-500, 2000), ylim = c(-500, 2500), main = k, cex = 1.0)
  for (i in seq(-10, 10, 2)) {
    points(x[z > i], y[z > i], col = cc[(12 + i) / 2], cex = 1.0)
  }
  rm(x, y, z)
2005 Dec 20
1
Problems in batch mode
Dear R-users, I am trying to run some simulations in batch mode. In an older version of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save <input file> output file
however, in the new version, R 2.2.0, the parameters vsize and nsize are ignored. I can use the command memory.limit to increase memory, but I am not sure if this corresponds to vsize and nsize.
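The likely explanation (my reading of the R release history; treat it as an assumption): from R 1.2.0 the heaps grow adaptively and the old --vsize/--nsize flags were replaced by --min-vsize/--min-nsize, plus --max-* ceilings, exactly the options listed in the Memory help page cited above. The equivalent invocation would be along the lines of (file names illustrative):

  rterm --min-vsize=100M --min-nsize=5000K --restore --save < input.R > output.txt

memory.limit on Windows adjusts the overall allocation cap, which is related to, but not the same thing as, the vsize/nsize heap parameters.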
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group, I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it would still be nice to work on some of the data using R. For that I can use a dedicated server with Gentoo Linux as the OS, hosting only R. This server is a nice machine with two CPUs and 4GB RAM, which should do the job: Dual Intel XEON 3.06 GHz
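A check worth making before tuning anything (a general point, not from the post): with a 32-bit R build, a single process on Linux can address only around 3 Gb no matter how much physical RAM is installed; which kind of build is running is one line away:

  .Machine$sizeof.pointer   # 4 -> 32-bit build (~3 Gb per-process ceiling); 8 -> 64-bit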