similar to: increase memory

Displaying 20 results from an estimated 10000 matches similar to: "increase memory"

2000 Nov 09
4
memory management
Dear experts, I am very concerned about memory management. I would appreciate it if you could leave me some tips on handling large datasets. Of special interest: 1. importing large data from a text file 2. subsequent manipulations in R. Thanks very much. Best regards, Pan Yuming
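A minimal sketch of the first point, assuming a whitespace-delimited file ("bigdata.txt", the column types, and the row count are illustrative): telling read.table the column classes and row count up front keeps it from repeatedly growing its buffers while scanning a large file.

    # Pre-declaring types and size avoids repeated re-allocation.
    x <- read.table("bigdata.txt", header = TRUE,
                    colClasses = c("numeric", "numeric", "character"),
                    nrows = 100000, comment.char = "")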
2000 Nov 16
2
assign names to matrix
Dear all, I have a matrix and I don't know how to assign names to it. Given that v is a 100x5 matrix and label <- c("A","B","C","D","E"): ideally names(v) <- label, but that doesn't work because the lengths differ; if dimnames(v) <- list(1:nrow(v), label), then names(v) returns NULL. Any smart ways? Thanks in advance. Best regards, Pan Yuming
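A short sketch of the usual idiom: a matrix takes dimnames rather than names, and column labels go through colnames() (or the second component of dimnames()).

    v <- matrix(rnorm(500), nrow = 100, ncol = 5)
    label <- c("A", "B", "C", "D", "E")
    colnames(v) <- label                  # labels the five columns
    # equivalently: dimnames(v) <- list(NULL, label)
    v[, "C"]                              # columns are now addressable by name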
2000 Nov 07
1
matrix transpose and object name
Hello everybody, I have several unresolved issues: 1. How do I do a matrix transpose? I can't find it in the documentation. 2. Suppose I have an object named lm1 from a linear regression; how could I refer to it via paste("lm", 1, sep="")? 3. To save a 100 x 30 matrix, how do I get a text file with 100 lines and 30 columns, instead of stacking them? Thanks. Best regards
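A minimal sketch answering all three questions with standard base-R tools (the example matrix and model are illustrative):

    m <- matrix(1:3000, nrow = 100, ncol = 30)   # example 100 x 30 matrix
    tm <- t(m)                                   # 1. transpose with t()
    lm1 <- lm(dist ~ speed, data = cars)         # a fitted model named lm1
    fit <- get(paste("lm", 1, sep = ""))         # 2. fetch an object by constructed name
    write.table(m, file = "m.txt",               # 3. one row per line, 30 columns,
                row.names = FALSE, col.names = FALSE)  #    rather than a stacked dump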
2009 May 07
1
increasing memory for R bg job
Hi, is the following the command used to increase memory when a background R job is run, or is there another one? R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N (source: http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html) Thanks, Carol
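A hedged sketch (flag values are illustrative): the options are given on the shell command line, not inside R, and gc() then reports the resulting Ncells/Vcells usage from within the session.

    # Shell invocation for a background batch job (values illustrative):
    #   R CMD BATCH --min-vsize=10M --max-vsize=1G \
    #               --min-nsize=500k --max-nsize=10M myjob.R myjob.Rout &
    gc()   # inside R: shows cons-cell and vector-heap usage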
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using multiple logins. From the documentation, it appears that vsize is limited to 2^30-1, which tends to prove too restrictive for our use. When we drop that restriction (set vsize = NA) we end up hanging the server, which requires a restart. Is there any way to increase the memory limits on R while keeping our jobs from
1999 Sep 13
2
increasing memory size
Help! I've done this before but can't remember how to do it, and can't find any reference to it in the docs I have access to now. I need to increase --vsize (I think), as I'm getting a message when I start my R session "vector heap too small to load data". How does one exactly do this, and what is the default setting? I don't know how much to increase it by. I know
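A rough sizing sketch: numeric data costs 8 bytes per element, so object.size() on a comparably sized vector gives a lower bound for --vsize (the 1e6 figure is illustrative).

    object.size(numeric(1e6))   # about 8 MB of vector heap for a million doubles
    # so startup might look like (value illustrative):
    #   R --vsize 20M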
1999 Apr 27
2
Memory management
Dear all, I don't get it. First of all, the help doesn't say what the memory limits of R are (what's the max heap size, for instance?). Secondly, I invoke R with the following commands each time: rgui --vsize 30M --nsize 1000K rgui --vsize 30M --nsize 2000K rgui --vsize 30M --nsize 3000K rgui --vsize 30M --nsize 4000K I try to create an 8000x8000 matrix by issuing
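Worked arithmetic for that example: an 8000 x 8000 numeric matrix alone needs 8000 * 8000 * 8 bytes, about 488 MB, so no --nsize setting can make it fit in --vsize 30M.

    8000 * 8000 * 8 / 2^20   # ~488 MB of vector heap for the matrix itself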
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R and R really struggles! After starting R with the following: R --vsize 50M --nsize 6M --no-restore on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2), I run a function that essentially picks up an external dataset with 2121 rows and 30 columns and builds an lm() object and also runs step() ... the step() takes forever to run... (takes very
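One hedged, illustrative mitigation (the data frame here is simulated): step() refits a model per candidate term, so keeping each fit small helps, and lm()'s model = FALSE stops every fit from carrying its own copy of the model frame.

    d <- as.data.frame(matrix(rnorm(2121 * 30), nrow = 2121))
    fit  <- lm(V1 ~ ., data = d, model = FALSE)   # drop the per-fit model frame
    fit2 <- step(fit, trace = FALSE)              # stepwise search over V2..V30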
2000 Mar 17
2
Windows Memory
I'm sure this question is answered in the help file, but likely I'm not reading it correctly. Running Windows version 1.00.0, loading a table (35K rows by 10 columns) from Excel using the read.table command, I receive the following message: Error: cons memory (350000 cells) exhausted See "help(Memory)" on how to increase the number of cons cells. From reading the
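A hedged alternative sketch, assuming the table is purely numeric with no header ("table.txt" is illustrative): scan() builds far less intermediate list structure than read.table(), which matters when cons cells are the exhausted resource.

    x <- matrix(scan("table.txt"), ncol = 10, byrow = TRUE)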
2006 Jun 25
1
R memory size increases
O/S : Solaris 9 R version : 2.2.1 I was getting out of memory errors from R when running a large job, so I've switched to a larger machine with 40G shared memory. I issue the following command when starting R to increase memory available to R: R --save --min-vsize=4G --min-nsize=4G When reading in a file, R responds with "could not allocate vector of size 146Kb." So I'm
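Worked arithmetic on those flags: --min-nsize counts cons cells, not bytes, and a cell costs roughly 56 bytes on a 64-bit build, so

    4e9 * 56 / 2^30   # --min-nsize=4G asks for ~209 GB of cell space

which by itself exceeds the 40G machine; the usual combination is a large --min-vsize with a much smaller --min-nsize (say 10M).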
2000 Apr 10
2
Increasing memory size in ESS
I am having a problem using ESS with R. In particular, I have large data objects which exceed the 6Mb default heap memory. Outside of ESS I can run R by specifying large values of --vsize and --nsize, but I can't figure out how to do this in ESS. Any help would be much appreciated. -- Angelo J. Canty
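One common approach, assuming a reasonably recent ESS: set the Emacs variable inferior-R-args (e.g. to "--vsize=50M --nsize=1000K") before starting the R process, so ESS passes the flags through at startup.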
1999 Nov 12
1
R-0.65.1 Startup
Dear R users, I have noticed that my R startup is extremely slow. It takes almost 3 minutes from "double-click" to R prompt. I have been running R-0.64.1 till recently and it took about 30 sec. I still have access to R-0.64.1. When I started it up, it took about 25 sec. Can anyone tell me if this is a bug in R or a problem with my machine? Note: This is after bootup with R being the
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all help files and FAQs on the subject matter (there aren't more than about a dozen) but either did not find solutions or found that they did not work. Here is the issue. I am trying to run a spatial regression on a medium-sized dataset. Some of the functions in the spdep package I use require me to allocate a vector of 1.1 GB (mine is not a spatial SIG
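A hedged sketch for the Windows builds of that era (these functions were Windows-only and have since been removed from current R):

    memory.limit()              # current cap, in MB
    memory.limit(size = 2047)   # raise it toward the 32-bit process ceiling
    memory.size(max = TRUE)     # peak memory used so far, in MB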
2001 Mar 12
4
1.2.2 under M$ windows 2000 lots of plots out of memory?
hi- If I source the following (asp=1 corrected from the original's asp=1i typo):
    for (k in 1:20) {
      x <- runif(20000, min = -500, max = 2000)
      y <- runif(20000, min = -500, max = 2500)
      z <- runif(20000, min = -10, max = 10)
      cat(k, "file", memory.size())
      cc <- rainbow(11)
      plot(x, y, asp = 1, xlim = c(-500, 2000), ylim = c(-500, 2500), main = k, cex = 1.0)
      for (i in seq(-10, 10, 2)) {
        points(x[z > i], y[z > i], col = cc[(12 + i) / 2], cex = 1.0)
      }
      rm(x, y, z)
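A hedged mitigation sketch (simplified loop, illustrative sizes): dropping the vectors and collecting explicitly between plots keeps the heap from ratcheting upward across iterations.

    for (k in 1:20) {
      x <- runif(20000)
      plot(x, main = k)
      rm(x)
      gc()   # reclaim the freed vector heap before the next round
    }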
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus server. I can load 10 chips without a problem; however, when I try to load 143 I receive an error message: cannot create a vector of 523263 KB. I have expanded the memory of R as follows: R --min-vsize=10M --max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in help in R). After running this command the
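Worked arithmetic on that error: the refused allocation is a single block of 523263 / 1024, about 511 MB, well under --max-vsize=2500M, so the limit itself is probably not the problem; more likely the process is out of address space, or the heap is too fragmented to supply one contiguous 511 MB block.

    523263 / 1024   # ~511 MB in one contiguous vector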
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear group, I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it would still be nice to work on some of the data in R. For this I can use a dedicated server with Gentoo Linux as OS, hosting only R. This server is a nice machine with two CPUs and 4GB RAM which should do the job: Dual Intel XEON 3.06 GHz
2001 Mar 01
3
How do you expand memory capability (Was: R crashes in Windows ME)
Hello- Since my data bank in SPSS has > 40 variables, I think that R crashes because of the memory limit. In Maindonald's UsingR text, on pg 3, there's a footnote that reads: "If you want larger memory space than the default you may want a target akin to <path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K [The default is --vsize 6M --nsize 250K
2001 Jul 16
2
Trouble with the memory allocation
Dear R-users, I am currently facing what appears to be a strange thing (at least to my humble understanding). If I understood correctly, starting with version 1.2.3, R memory allocation can be done dynamically, and there is no need to fiddle with the --nsize and --vsize parameters any longer... So far everything seemed to go that way (I saw the size of my processes growing when I was
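A small sketch for watching that dynamic growth: gcinfo(TRUE) makes each garbage collection print the current heap sizes as they expand on demand.

    gcinfo(TRUE)                          # report sizes at every collection
    x <- matrix(rnorm(1e6), ncol = 100)   # triggers collections as the heap grows
    gcinfo(FALSE)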
2002 Apr 12
1
Problems with memory
Dear all, I started working with R (vs 1041) a few weeks ago, and now I'm having problems with the amount of memory. I'm working on Windows ME; my computer has 128 MB of memory. I'm using R under Emacs (ESS-5.1.20), started by the command: Rterm --min-vsize=10M --max-vsize=100M --min-nsize=500k --max-nsize=1M I've been having problems when executing a
1999 May 15
2
vsize and nsize
I am running R version ??? under Redhat 5.2. It seems as though the --nsize option has no effect on the size of the allocated Ncells as determined using gc(). Yes, I have that much data... That is, if I invoke R with R --vsize 100 --nsize 5000000 and then type gc() I get

               free     total
    Ncells    92202    200000
    Vcells 12928414  13107200

Thanks, Tony Long Ecology and Evolutionary Biology Steinhaus
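A worked check of that gc() output: the Vcells total does reflect the flag, since 13107200 cells * 8 bytes = 100 MB, matching --vsize 100; the Ncells total of 200000, by contrast, looks like a default-sized pool, so only the --nsize value failed to take effect.

    13107200 * 8 / 2^20   # = 100 (MB), so --vsize was honored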