similar to: gc() shrinks with multiple iterations

Displaying 20 results from an estimated 2000 matches similar to: "gc() shrinks with multiple iterations"

2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi, when doing as.double() on an object that is already a double, the object seems to be copied internally, doubling the memory requirement. See example below. Same for as.character() etc. Is this intended? Example:
% R --vanilla
> x <- double(1e7)
> gc()
           used (Mb) gc trigger (Mb) max used (Mb)
Ncells   234019  6.3     467875 12.5   350000  9.4
Vcells 10103774 77.1
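A minimal sketch of how one might observe such a copy, assuming a build where tracemem() is available (it requires memory profiling to be enabled on some platforms):

    x <- double(1e7)    # x is already a double
    tracemem(x)         # report whenever x gets duplicated
    y <- as.double(x)   # if a copy is made, tracemem prints a message
    gc()                # compare Vcells against a run without as.double()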
2005 Jun 10
1
gc() and gc trigger
hello, my question concerns the memory used and garbage collection after objects have been removed. What is wrong? Before:
> gc()
          used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells 313142  8.4    1801024   48.1   1835812   49.1
Vcells 809238  6.2  142909728 1090.4 178426948 1361.3
Here all attached objects
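As a sketch of the usual diagnostic, one can snapshot gc() before and after removing objects. The "used" column drops immediately, while the "gc trigger" column typically only comes down over several collections, which is what the thread title refers to:

    gc()                # baseline
    x <- double(1e8)    # roughly 800 MB of Vcells
    rm(x)
    gc(); gc()          # "used" drops at once; "gc trigger" may take
                        # several collections to shrink back down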
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related issues - hopefully I'm not bringing up an old topic. Am using RW1041 on Windows NT on a machine with 1Gb of memory. Have a function doit() that reads in a chunk of data using readBin, performs a regression, saves out coeffs and then returns. When using Rgui with the default memory limit of 256Mb I'm able to
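A rough sketch of such a doit() function, under the assumption that the data is a flat binary file of doubles (the file layout and the lag-1 regression here are made up for illustration):

    doit <- function(path, n = 1e5) {
      con <- file(path, "rb")
      on.exit(close(con))
      chunk <- readBin(con, what = "double", n = n)   # read one chunk
      fit <- lm(chunk[-1] ~ chunk[-length(chunk)])    # toy regression
      coef(fit)                                       # return only the coefficients
    }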
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found there can be large discrepancies in the same object being saved as an external "xx.RData" file. The immediate repercussion of this is a possibly increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
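One common source of such discrepancies (an assumption about this particular report, not a confirmed diagnosis) is a returned object that captures its enclosing environment, so that save() serializes everything that environment holds:

    f <- function() {
      junk <- double(1e6)        # ~8 MB the caller never asked for
      function(x) x + 1          # this closure's environment includes junk
    }
    g <- f()
    save(g, file = "g.RData")    # the file carries junk along with g
    file.info("g.RData")$size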
2006 Jan 26
1
maximizing available memory under windows XP
I have always been using editbin to set the 3GB switch in the Windows binary, but version 2.2.1 has this set as default (which I verified using dumpbin). However, when I generate junk data to fill up my memory and read the memory usage using gc(), it seems that I am not getting as good results with 2.2.1 patched as I was with 2.2.0 after I edited the header. Under R 2.2.0 I was able to use over
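On Windows builds of that era, memory.limit() was the usual way to inspect and raise the cap (it is Windows-only, takes the size in Mb, and was retired in much later R versions):

    memory.limit()               # report the current cap in Mb
    memory.limit(size = 3000)    # request ~3 GB; only effective if the OS
                                 # and the 3GB switch actually allow it
    gc()                         # check how much is really usable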
2007 Aug 23
1
.Call and to reclaim the memory by allocVector
Hi, I am not sure if this is a bug and I apologize if it is something I didn't read carefully in the R extension manual. My initial search on the R help and R devel list archive didn't find useful information. I am using .Call (as written in the R extension manual) for the C code and have found that the .Call didn't release the memory claimed by allocVector. Even after applying
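From the R side, one hedged way to check whether memory really leaks across calls (the routine name below is a placeholder, not from the original post): repeated calls whose results are discarded should leave the gc() totals roughly flat.

    for (i in 1:100) {
      res <- .Call("my_c_routine")   # hypothetical registered C routine
      rm(res)
    }
    gc()   # with correct PROTECT/UNPROTECT in the C code,
           # Vcells should return close to baseline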
2002 Feb 01
1
Memory leak in read.table (PR#1292)
Full_Name: Ashley Ford Version: 1.4.0 OS: Windows NT4 Submission from: (NULL) (146.80.9.20) I am suffering from a memory leak in read.table in the new precompiled Windows 1.4; it works fine in 1.3. Create a 90000-line file of 7 variables, e.g.
perl -e '$e=exp(1);for($i=0;$i<90000;$i++){printf "%d".(" %f"x6)."\n", $i,$i*$e,3,4,5,6,7,8,9}' > n90000
R :
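A sketch of how such a leak can be demonstrated, assuming the n90000 file generated by the perl one-liner above:

    for (i in 1:5) {
      d <- read.table("n90000")
      rm(d)
      print(gc())   # on a leaking build, "used" and "max used"
    }               # keep climbing even though d is removed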
2008 Jan 30
1
Understanding an R improvement that already occurred.
I was surprised to observe the following difference between 2.4.1 and 2.6.0 after a long overdue upgrade a few months ago of our departmental server. It wasn't a bug fix, but a subtle improvement. Here's the simplest example I could create. The size is excessive, on the order of the Netflix Competition data. The integer matrix is about 1.12 GB, and if coerced to numeric it is 2.24 GB.
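The arithmetic behind those sizes is simple: R stores integers in 4 bytes and doubles in 8, so coercing a 1.12 GB integer matrix to numeric doubles it to 2.24 GB. A small sketch at a friendlier scale:

    m <- matrix(1L, 1000, 1000)
    object.size(m)    # ~4 MB: 1e6 integers at 4 bytes each
    m2 <- m + 0       # arithmetic coerces to double
    object.size(m2)   # ~8 MB: 1e6 doubles at 8 bytes each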
2002 Oct 14
1
R 1.6.0 Solaris crash with xmalloc: out of virtual memory
[Some de-capitalization of *SXP done manually by the mailing list maintainer; the original message was caught as potential spam. MM] I have a little R program that crashes with the message "xmalloc: out of virtual memory". The code has a repeat{} loop that watches the sizes of some files. When there's an increase it updates things by reading the last 65 lines of each file, doing some
2008 Apr 07
0
Some memory questions: data.frame and lists.
Hi there, I seek your expert opinion on the following memory-related questions. The output below was obtained from R-2.6.2, compiled with --enable-memory-profiling on Ubuntu Linux.
=======================================================================
>>> Code and output 1:
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 131180  7.1     350000 18.7   350000 18.7
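For questions of this kind, object.size() per object plus gc() for the heap as a whole are the standard probes. A minimal sketch comparing a list with the equivalent data.frame:

    l <- list(a = double(1e6), b = double(1e6))
    d <- as.data.frame(l)
    object.size(l)   # the two columns' data
    object.size(d)   # roughly the same data plus row names and attributes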
2011 Nov 13
1
Understand Ncells and Vcells, from gc()
Dear all, I am working on a 64-bit Linux system. I issue the following R commands:
> rm(list=ls())   # To remove all objects in the workspace.
> gc()            # To free memory.
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124250  6.7     350000 18.7   350000 18.7
Vcells 124547  1.0     786432  6.0   476934  3.7
> gc()            # I had to do it again, don't know why!
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells
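As a rule of thumb (per ?gc): Ncells count R's fixed-size language nodes (cons cells), while Vcells count the 8-byte units of the vector heap. A sketch that makes the distinction visible:

    gc()
    x <- numeric(1e6)   # vector data lands in Vcells (~1e6 of them)
    f <- function(a) a  # code and environments consume Ncells
    gc()                # compare both rows against the first snapshot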
2012 May 25
1
R memory allocation
Dear All, I am running R on a system with the following configuration:
Processor: Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
OS: Ubuntu x86_64 10.10
RAM: 24 GB
The R session info is:
R version 2.14.1 (2011-12-22)
Platform: x86_64-pc-linux-gnu (64-bit)
locale:
 [1] LC_CTYPE=en_US.UTF-8     LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8      LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8
2007 May 18
1
AIX testers needed
Per the request to test the latest tarball referenced below, I have built R on AIX 5.3. There is a memory issue; please see 3) below.
1) Built with the --enable-BLAS-shlib option. Builds and passes "make check".
2) GNU libiconv was installed; R configured *without* the --without-iconv option. Builds and passes "make check".
3) Memory issue: a)
2011 Nov 13
1
To moderator
No. But it has not been posted either. You got that message because you sent your message to the wrong address. You should have sent it to r-help at r-project.org. You had probably sent it to r-help-request at r-project.org, which would have had the effect that the server would have tried to interpret the contents of your message as commands (e.g. to unsubscribe, change your subscription
2001 Nov 26
2
R not giving memory back to system?
This might be because I didn't get it right, but; I thought R would release memory back to the system as (big) objects get removed? Here is my platform (with 1Gb of RAM):
platform sparc-sun-solaris2.8
arch     sparc
os       solaris2.8
system   sparc, solaris2.8
status
major    1
minor    3.1
year     2001
month    08
day      31
language R
A little example: Start a new session of R, with
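A minimal way to see this effect, sketched for any session: the general behaviour, that freed pages may stay with the process allocator rather than go back to the operating system, is not specific to R.

    x <- double(5e7)   # ~400 MB
    gc()               # Vcells report the allocation
    rm(x)
    gc()               # R's "used" drops, yet top/ps may still show
                       # a large resident size for the R process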
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory. I understood that R takes a fixed amount of memory at startup (which I can influence with --vsize and --nsize) and that gc() shows the memory still free of the total memory reserved for R. However, if I create a long vector of character data, gc() only seems to reflect the space needed for a vector of pointers to char, the space used for the character data itself
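In current R the accounting covers both parts: the character vector itself is a vector of pointers, and each distinct string is stored separately. A sketch, with the caveat that the exact Ncells/Vcells split has changed across R versions (this poster was on a much older R):

    x <- as.character(seq_len(1e5))  # 1e5 distinct strings
    object.size(x)   # counts the pointer vector plus the strings' storage
    gc()             # heap-wide view; on modern R the string data
                     # shows up in the gc() totals as well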
2016 Nov 11
0
Memory leak with tons of closed connections
>>>>> Gergely Daróczi <daroczig at rapporter.net>
>>>>> on Thu, 10 Nov 2016 16:48:12 +0100 writes:

 > Dear All,
 > I'm developing an R application running inside of a Java daemon on
 > multiple threads, and interacting with the parent daemon via stdin and
 > stdout. Everything works perfectly fine except for having some
2010 Dec 23
1
speed issues? read R_inferno by Patrick Burns: & a memory query
Hi, I'm just starting out with R and came across R_inferno.pdf by Patrick Burns just yesterday - I recommend it! His description of how 'growing' objects (e.g. obj <- c(obj, additionalValue)) eats up memory prompted me to rewrite a function (which made such calls ~210 times) so that it used indexing into a dimensioned object instead (i.e. obj[i, ] <- additionalValue). This
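The contrast the Inferno describes, as a minimal sketch: every c() call copies the whole accumulated vector, while writing into a preallocated object does not.

    # quadratic: each iteration copies everything accumulated so far
    res <- NULL
    for (i in 1:1e4) res <- c(res, i^2)

    # linear: one allocation up front, in-place assignment after
    res <- numeric(1e4)
    for (i in 1:1e4) res[i] <- i^2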
2007 Sep 27
0
Unnecessary extra copy with matrix(..., dimnames=NULL) (Was: Re: modifying large R objects in place)
As others already mentioned, in your example you are first creating an integer matrix and then coercing it to a double matrix by assigning (double) 1 to element [1,1]. However, even when correcting for this mistake, there is an extra copy created when using matrix(). Try this in a fresh vanilla R session:
> print(gc())
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 136684  3.7
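The coercion mistake mentioned above, sketched: assigning the double 1 into an integer matrix forces a full coerce-and-copy, which the integer literal 1L avoids.

    m <- matrix(0L, 1000, 1000)   # integer storage, ~4 MB
    typeof(m)        # "integer"
    m[1, 1] <- 1     # 1 is double: the whole matrix is coerced (and copied)
    typeof(m)        # "double", now ~8 MB
    m2 <- matrix(0L, 1000, 1000)
    m2[1, 1] <- 1L   # integer assignment: no coercion
    typeof(m2)       # still "integer"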