similar to: Memory Problems in R

Displaying 20 results from an estimated 1000 matches similar to: "Memory Problems in R"

2005 Jan 14
1
S3/S4 classes performance comparison
Hi R-devel, if you read my survey on R-help about reporting, you may have seen that I am implementing a way to handle outputs for R (the main target output destinations being xHTML and TeX). In fact, I do have something that works for basic objects, entirely done with S4 classes, with the results visible at: http://www.stat.ucl.ac.be/ROMA/sample.htm http://www.stat.ucl.ac.be/ROMA/sample.pdf To
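A minimal way to compare S3 and S4 dispatch cost yourself (a sketch, not from the thread; the class and generic names are invented):

    # S3 version: dispatch via UseMethod
    obj3 <- structure(list(x = 1), class = "A3")
    getx_s3 <- function(obj) UseMethod("getx_s3")
    getx_s3.A3 <- function(obj) obj$x

    # S4 version: dispatch via standardGeneric
    setClass("A4", representation(x = "numeric"))
    setGeneric("getx_s4", function(obj) standardGeneric("getx_s4"))
    setMethod("getx_s4", "A4", function(obj) obj@x)
    obj4 <- new("A4", x = 1)

    system.time(for (i in 1:100000) getx_s3(obj3))  # S3 dispatch
    system.time(for (i in 1:100000) getx_s4(obj4))  # S4 dispatch, typically slower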
2010 Oct 10
2
GC verbose=false still showing report
I must be reading the help file for gc() wrong. I thought it said that gc(verbose=FALSE) will run the garbage collection without printing the Ncells/Vcells summary. However, this is what I get:
> gc(verbose = FALSE)
          used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells  267097 14.3     531268  28.4   531268  28.4
Vcells  429302  3.3   20829406 159.0 55923977 426.7
I'm embedding this in an
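The usual explanation: verbose controls only the extra collection messages; the summary shown above is gc()'s return value being auto-printed at the prompt. A small illustration:

    g <- gc(verbose = FALSE)        # nothing printed: the result is captured in `g`
    invisible(gc(verbose = FALSE))  # nothing printed: the result is discarded
    gc(verbose = FALSE)             # prints: the returned matrix auto-prints at the prompt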
2004 Aug 07
1
memory usage of S4 methods
Hi, I have some problems with the memory usage of S4 generics. For example, I observed the following behaviour:
> gc()
         used (Mb) gc trigger (Mb)
Ncells 432091 11.6     531268 14.2
Vcells 116052  0.9     786432  6.0
> setClass("A", representation(x = "numeric"));
[1] "A"
> setClass("B", representation(x = "numeric"));
[1] "B"
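One way to quantify such overhead (a sketch; the class and generic names are invented): snapshot gc() before and after the definitions and difference the counts.

    before <- gc()
    setClass("C1", representation(x = "numeric"))
    setGeneric("getC1", function(obj) standardGeneric("getC1"))
    setMethod("getC1", "C1", function(obj) obj@x)
    after <- gc()
    after[, "used"] - before[, "used"]  # Ncells/Vcells consumed by the definitions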
2003 Jun 02
1
'methods' and environments.
Hi, I am having quite some trouble with the methods package. "Environments" in R are a convenient way to emulate pointers (and to avoid copies of large objects, or of large collections of objects). So far, so good, but the methods package is becoming more (and more) problematic to work with. Up to version R-1.7.0, slots that were environments were still references to an environment, but I
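The pointer-like behaviour referred to, in miniature (a sketch, not from the thread):

    e <- new.env()
    e$big <- numeric(1e6)                     # a large object stored in the environment
    touch <- function(env) env$big[1] <- 42   # `env` is a reference: modifies in place
    touch(e)
    e$big[1]                                  # 42: the caller sees the change, no copy of `e` made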
2004 Jan 14
2
R internal data types
I am trying to figure out R data types and/or storage modes. For example:
> # From a clean workspace
> gc()
         used (Mb) gc trigger (Mb)
Ncells 415227 11.1     597831   16
Vcells 103533  0.8     786432    6
> x <- seq(0, 100000, 1)
> is.integer(x)
[1] FALSE
> is.double(x)
[1] TRUE
> object.size(x)
[1] 800036
> gc()
         used (Mb) gc trigger (Mb)
Ncells 415247
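The reason for the FALSE above: seq(0, 100000, 1) produces a double vector. The same values stored as integers take half the space (a sketch; byte counts are approximate and platform-dependent):

    x <- seq(0, 100000, 1)
    typeof(x)        # "double": ~8 bytes per element
    object.size(x)   # ~800 kB
    y <- as.integer(x)
    typeof(y)        # "integer": ~4 bytes per element
    object.size(y)   # ~400 kB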
2007 Mar 01
4
R File IO Slow?
Is R file IO slow in general, or am I missing something? It takes me 5 minutes to do a load(MYFILE), where MYFILE is a 27 MB Rdata file. Is there any way to speed this up? The one idea I have is having R call a C or Perl routine that reads the file in that language, converts the data into R objects, then sends them back to R. This is more work than I want to do, however, in loading Rdata
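A first diagnostic worth trying (a sketch; the file names are placeholders): time the load, then re-save without compression to see whether decompression dominates.

    system.time(load("MYFILE.Rdata"))                             # baseline
    save(list = ls(), file = "MYFILE-raw.Rdata", compress = FALSE)
    system.time(load("MYFILE-raw.Rdata"))                         # compare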
2006 May 22
1
win2k memory problem with merge()'ing repeatedly (long email)
Good afternoon, I have 63 small .csv files which I process daily; until two weeks ago they processed just fine, took only a matter of moments, and had no noticeable memory problems. Two weeks ago they reached 318 lines and my script "broke". There are some missing values in some of the files. I have tried hard many times over the last two weeks to create a
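For merging many small files in one pass, a common pattern (a sketch, assuming the files share a key column here called "date"; the directory and column names are hypothetical):

    files <- list.files("daily", pattern = "\\.csv$", full.names = TRUE)
    dfs <- lapply(files, read.csv)
    merged <- Reduce(function(a, b) merge(a, b, by = "date", all = TRUE), dfs)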
2011 Nov 13
1
Understand Ncells and Vcells, from gc()
Dear all, I am working on a 64-bit Linux system. I issue the following R commands:
> rm(list = ls())  # To remove all objects in the workspace.
> gc()             # To free memory.
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124250  6.7     350000 18.7   350000 18.7
Vcells 124547  1.0     786432  6.0   476934  3.7
> gc()  # I had to do it again, don't know why!
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells
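In short: Ncells counts cons cells (fixed-size nodes used for language objects and list structure), Vcells counts 8-byte units of the vector heap. A quick check of the correspondence (a sketch):

    before <- gc()
    x <- numeric(1e6)    # one million doubles
    after <- gc()
    after["Vcells", "used"] - before["Vcells", "used"]  # ~1e6 Vcells (8 bytes each)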
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory. I understood that R takes a fixed amount of memory at startup (which I can influence with --vsize and --nsize) and that gc() shows the memory still free of the total memory reserved for R. However, if I create a long vector of character data, gc() only seems to reflect the space needed for a vector of pointers to char; the space used for the character data itself
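This can be probed directly; identical strings are stored once in R's string cache, so the vector itself is essentially an array of pointers (a sketch; 64-bit sizes assumed):

    g0 <- gc()
    x <- rep(strrep("a", 1000), 1e5)  # 100,000 elements sharing one 1000-char string
    g1 <- gc()
    (g1["Vcells", "used"] - g0["Vcells", "used"]) * 8  # ~0.8 MB: mostly the pointer array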
2011 Nov 13
1
To moderator
No. But it has not been posted either. You got that message because you sent your message to the wrong address. You should have sent it to r-help at r-project.org. You probably sent it to r-help-request at r-project.org, which would have had the effect that the server tried to interpret the contents of your message as commands (e.g. to unsubscribe, or change your subscription
2001 Mar 13
3
gc() shrinks with multiple iterations
Is it expected behavior for gc() to return shrinking values as it gets called multiple times? Here's what I've got:
> gc()
          used (Mb) gc trigger  (Mb)
Ncells  221754  6.0     467875  12.5
Vcells 3760209 28.7   14880310 113.6
> gc()
          used (Mb) gc trigger  (Mb)
Ncells  221760  6.0     467875  12.5
Vcells 3016206 23.1   11904247  90.9
> gc()
          used (Mb) gc
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please, I have a 2GB computer and a huge time series to embed, and I tried increasing memory.limit() and memory.size(max=TRUE), but nothing helped. Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552  5.6     407500 10.9   350000  9.4
Vcells 125966  1.0     786432  6.0   496686  3.8
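On 32-bit Windows builds of that era, the usual knobs were these (a sketch; memory.limit() was Windows-only and has since been removed from R):

    memory.limit()             # current cap in MB
    memory.limit(size = 2047)  # raise the cap toward the 2 GB address-space limit
    gc()                       # then retry the allocation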
2002 Oct 11
1
growing process size in simulation
I came across this in a simulation I ran under 1.6.0. If I do something like
R> x <- rnorm(10)
R> rval <- NULL
R> for(i in 1:100000) rval <- t.test(x)$p.value
then the process size remains at about 14M under 1.5.1, but it seems to grow almost linearly to more than 100M under 1.6.0. I know that the above simulation is nonsense, but it was the simplest I could come up
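A way to watch the heap during such a loop (a sketch): turn on per-collection reporting with gcinfo().

    gcinfo(TRUE)    # print one line at every garbage collection
    x <- rnorm(10)
    rval <- NULL
    for (i in 1:100000) rval <- t.test(x)$p.value
    gcinfo(FALSE)
    gc()            # final usage summary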
2007 Aug 16
2
Possible memory leak with R v.2.5.0
I'm working with a very large matrix (22k rows x 2k cols) of RNA expression data with R v.2.5.0 on a Red Hat Enterprise machine, x86_64 architecture. The relevant code is below, but I call a function that takes a cluster of this data (a list structure that contains a $rows element listing the rows (genes) in the cluster by ID, but not the actual data itself). The
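The access pattern described, in miniature (a sketch with invented names and a scaled-down matrix; the real one was 22k x 2k):

    expr <- matrix(rnorm(2200 * 200), nrow = 2200,
                   dimnames = list(paste0("gene", 1:2200), NULL))
    cluster <- list(rows = c("gene1", "gene5", "gene42"))  # row IDs only, no data
    sub <- expr[cluster$rows, , drop = FALSE]              # copies just the selected rows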
2002 Apr 29
1
Garbage collection: RW1041
I have searched through the archives but have been unable to find any related issues - hopefully I'm not bringing up an old topic. I am using RW1041 on Windows NT on a machine with 1Gb of memory. I have a function doit() that reads in a chunk of data using readBin, performs a regression, saves out the coefficients and then returns. When using Rgui with the default memory limit of 256Mb I'm able to
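A doit()-style chunked read (a sketch; the file name, record layout, and stand-in regression are all assumptions):

    doit <- function(con, n = 1e6) {
      chunk <- readBin(con, what = "double", n = n)     # next block of doubles
      fit <- lm.fit(cbind(1, seq_along(chunk)), chunk)  # stand-in regression
      fit$coefficients
    }
    con <- file("data.bin", "rb")
    res <- doit(con)
    close(con)
    rm(res); gc()   # release explicitly before the next round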
2001 Nov 26
2
R not giving memory back to system?
This might be because I didn't get it right, but I thought R would release memory back to the system as (big) objects get removed? Here is my platform (with 1Gb of RAM):
platform sparc-sun-solaris2.8
arch     sparc
os       solaris2.8
system   sparc, solaris2.8
status
major    1
minor    3.1
year     2001
month    08
day      31
language R
A little example: start a new session of R, with
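The usual answer: gc() frees memory from R's point of view, but the C allocator underneath may keep the pages, so the OS-level process size need not shrink. Observable like this (a sketch):

    x <- numeric(5e7)  # ~400 MB
    gc()               # Vcells "used" is now large
    rm(x)
    gc()               # Vcells "used" drops again, yet top/ps may
                       # still show the old process size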
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs! I have the following problem: I have a function that calls one or more functions, depending on the input parameters. I am searching for the fastest way to select and execute the selected functions and return their results in a list. The number of possible functions is 10, but usually only 2 are selected (although sometimes more, even all). For example, if I have function
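A common idiom for this (a sketch; the statistic functions stand in for the real ten): keep the candidates in a named list and lapply over the chosen names.

    funs <- list(mean = mean, sd = sd, median = median, mad = mad, IQR = IQR)
    run_selected <- function(x, which) lapply(funs[which], function(f) f(x))
    run_selected(rnorm(100), c("mean", "sd"))  # named list with the two results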
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello! I use a 32-bit Linux (SuSE) server, so I am limited to 3.5GB of memory. I can demonstrate that from time to time there is a problem with allocating objects of large size. For example, state 0 (no objects yet created):
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162070  4.4     350000  9.4   350000
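The bitness of the running build, and hence the address-space ceiling, can be checked directly (a sketch):

    .Machine$sizeof.pointer  # 4 on 32-bit builds (~3-4 GB per process), 8 on 64-bit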
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB". I know that similar problems have been discussed a lot already, but I didn't find any satisfactory answers so far! Details: *** I have XP (32bit) with 4GB RAM. At the time the problem appeared I had 1.5GB of available physical memory. *** I increased the R memory limit to 3GB via memory.limit(3000) *** I did gc() and got
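Worth remembering when reading that message: the quoted size is one contiguous block requested on top of what is already in use, not the total footprint. The arithmetic (a sketch):

    197 * 2^20 / 8  # ~25.8 million doubles must fit in a single contiguous block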
2006 May 16
2
Large database help
Hello all. I have a large .txt file whose variables are in fixed-width columns, i.e., variable V1 occupies columns 1 to 7, V2 columns 8 to 23, etc. This is a 60GB file with 90 variables and 60 million observations. I'm working with a Pentium 4, 1GB RAM, Windows XP Pro. I tried the following code just to see if I could work with 2 variables, but it seems not to be possible: R : Copyright 2005, The R Foundation
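A chunked read.fwf() pattern for pulling just V1 and V2 (a sketch; the file name and the 1000-byte record width are assumptions, and the negative width skips the remaining columns of each record):

    con <- file("bigfile.txt", "r")
    repeat {
      lines <- readLines(con, n = 100000)         # next 100k records
      if (length(lines) == 0) break
      chunk <- read.fwf(textConnection(lines),
                        widths = c(7, 16, -977),  # V1, V2, skip the rest
                        colClasses = c("numeric", "numeric"))
      # ... aggregate `chunk` or append it to an output file here ...
    }
    close(con)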