Displaying 20 results from an estimated 2000 matches similar to: "R not giving memory back to system?"
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please,
I have a 2 GB computer and a huge time series to embed, and I tried
increasing memory.limit() and memory.size(max=TRUE), but nothing changed.
Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552  5.6     407500 10.9   350000  9.4
Vcells 125966  1.0     786432  6.0   496686  3.8
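A minimal sketch of the usual first step (assuming the 32-bit Windows builds of that era, where memory.limit() and memory.size() applied; they are stubs in R >= 4.2):
memory.size(max = TRUE)    # most memory obtained from the OS so far (Mb)
memory.limit()             # current limit (Mb)
memory.limit(size = 3000)  # raise the limit; needs a /3GB-enabled 32-bit Windows
gc()                       # collect garbage before retrying
x <- numeric(28e6)         # roughly 216 Mb of doubles (28e6 * 8 bytes)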
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB".
I know that similar problems have been discussed a lot already, but I
haven't found any satisfactory answers so far!
Details:
*** I have Windows XP (32-bit) with 4 GB RAM. At the time the problem
appeared I had 1.5 GB of available physical memory.
*** I increased the R memory limit to 3 GB via memory.limit(3000)
*** I did gc() and got
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers,
After some investigation I have found that the same object can vary greatly in size when saved to an external "xx.RData" file. The immediate repercussion of this is that your .RData workspace can grow for no apparent reason.
The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
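One plausible source of such discrepancies (a sketch, not necessarily the poster's scenario) is a returned closure dragging a large enclosing environment into the saved file:
make_small <- function() {
  big <- rnorm(1e6)       # ~8 Mb of doubles, local to this call
  function(x) x + 1       # closure capturing the frame that holds 'big'
}
f <- make_small()
save(f, file = "with_env.RData")     # serializes 'big' along with f
g <- function(x) x + 1               # same behaviour, global environment
save(g, file = "without_env.RData")  # a few hundred bytes
file.info(c("with_env.RData", "without_env.RData"))$size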
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues?
   I have tried using gc() to see how much memory is used at each step.
   I have scanned Crawley's R Book, all the other R books I have available,
   and the on-line FAQ, but found no real help.
   Running WinXP Pro (32-bit) with 4 GB RAM.
   One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as
   virtual memory.
   I do have another machine
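A small sketch of that per-step bookkeeping (assuming the default two-row gc() report, whose second column is Mb used):
mem_mb <- function() sum(gc()[, 2])  # Ncells Mb + Vcells Mb, post-collection
before <- mem_mb()
x <- matrix(rnorm(1e6), ncol = 10)   # an example analysis step
after <- mem_mb()
cat("this step retained", round(after - before, 1), "Mb\n")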
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
dear R experts:
I am of course no R expert, but I use it regularly.  I thought I would
share some experimentation with memory use.  I run a Linux machine
with about 4 GB of memory, and R 2.5.0.
upon startup, gc() reports
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 268755 14.4     407500 21.8   350000 18.7
Vcells 139137  1.1     786432  6.0   444750  3.4
This is my baseline.  linux
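The experiment can be reproduced along these lines (file name hypothetical); the ratio at the end is what the 10-15x rule of thumb refers to:
fname <- "big.csv"                     # hypothetical input file
dat   <- read.table(fname, header = TRUE, sep = ",")
disk  <- file.info(fname)$size         # bytes on disk
mem   <- as.numeric(object.size(dat))  # bytes once parsed into R
mem / disk                             # in-memory multiple of the file size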
2002 Sep 18
1
problem with make fullcheck on Sparc Solaris 8
I have been trying out R-1.6.0 tarballs (2002-9-10 and 2002-9-17) on:
arch     sparc
os       solaris2.8
system   sparc, solaris2.8
status   beta
major    1
minor    6.0
year     2002
month    09
day      17
language R
As you can see from the above, R-1.6.0 compiles fine and works.  However, when I 
  "make fullcheck" I get the following error:
running code in 'tools-Ex.R' ... OK
2002 Sep 20
0
problem with make on sparc solaris 8 ( R-1.6.0beta_2002-09-18.tar.gz)
This is something that I have not seen in earlier beta versions of 1.6.0:
.
.
.
   ts.plot                           text    html    latex   example
   ts.union                          text    html    latex   example
   tsSmooth                          text    html    latex
   tsdiag                            text    html    latex   example
R_LIBS= ../../../bin/R CMD INSTALL
ERROR: no packages
2001 Mar 13
3
gc() shrinks with multiple iterations
Is it expected behavior for gc() to return shrinking values as it gets
called multiple times? Here's what I've got:
> gc()
          used (Mb) gc trigger  (Mb)
Ncells  221754  6.0     467875  12.5
Vcells 3760209 28.7   14880310 113.6
> gc()
          used (Mb) gc trigger (Mb)
Ncells  221760  6.0     467875 12.5
Vcells 3016206 23.1   11904247 90.9
> gc()
          used (Mb) gc
2011 Nov 13
1
Understand Ncells and Vcells, from gc()
Dear all,
I am working on a 64 bits Linux system.
I issue the following R commands:
 > rm(list=ls()) # To remove all objects in the workspace.
 > gc() # To free memory.
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124250  6.7     350000 18.7   350000 18.7
Vcells 124547  1.0     786432  6.0   476934  3.7
 > gc() # I had to do it again, don't know why!
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells
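For what it's worth: Ncells count R's fixed-size "cons" cells (one per SEXP node, e.g. language objects and object headers), while Vcells count 8-byte heap cells holding vector data. Two successive gc() calls can also legitimately differ, e.g. when the first collection runs finalizers that release further memory. A quick sketch of which allocations move which counter:
gc()                  # baseline
x <- numeric(1e7)     # one big vector: Vcells jump by ~1e7, Ncells barely move
gc()
y <- as.list(1:1e5)   # 100000 tiny objects: Ncells jump by ~1e5
gc()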
2007 Aug 09
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Hi,
I've been having similar experiences and haven't been able to
substantially improve the efficiency using the guidance in the I/O
Manual.
Could anyone advise on how to improve the following scan()?  It is not
based on my real file; please assume that I do need to read in
characters and can't do any pre-processing of the file, etc.
## Create Sample File
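The usual advice, sketched under assumptions about the file (five comma-separated character columns; adjust 'what' to the real layout): give scan() the column types up front and switch off per-field processing it doesn't need:
cols <- rep(list(character(0)), 5)   # assumed: five character columns
dat <- scan("sample.txt", what = cols, sep = ",",
            quote = "", comment.char = "", allowEscapes = FALSE,
            multi.line = FALSE, quiet = TRUE)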
2011 Nov 13
1
To moderator
No. But it has not been posted either.
You got that message because you sent your message to
the wrong address. You should have sent it to
  r-help at r-project.org
You had probably sent it to
  r-help-request at r-project.org
which would have caused the server to try to interpret
the contents of your message as commands
(e.g. to unsubscribe, change your subscription
2006 Nov 06
2
gc()$Vcells < 0 (PR#9345)
Full_Name: Don Maszle
Version: 2.3.0
OS: x86_64-unknown-linux-gnu
Submission from: (NULL) (206.86.87.3)
# On our new 32 GB x86_64 machine
R : Copyright 2006, The R Foundation for Statistical Computing
Version 2.3.0 (2006-04-24)
ISBN 3-900051-07-0
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory.
I understood that R takes a fixed amount of memory at startup (which I can
influence with --vsize --nsize) and that gc() shows the memory still free of
the total memory reserved for R.
However, if I create a long vector of character data, gc() only seems to
reflect the space needed for a vector of pointers to char, not the space
used for the character data itself
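In current R each distinct string is a CHARSXP stored once in a global cache, and a character vector holds pointers into that cache, so the accounting can indeed look surprising; a sketch:
x <- rep("one repeated string", 1e6)   # a million pointers, one cached string
y <- paste0("distinct-", 1:1e6)        # a million distinct cached strings
object.size(x)                         # roughly 8 Mb: the string is stored once
object.size(y)                         # far larger: every element is distinct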
2016 Nov 11
0
Memory leak with tons of closed connections
>>>>> Gergely Daróczi <daroczig at rapporter.net>
>>>>>     on Thu, 10 Nov 2016 16:48:12 +0100 writes:
    > Dear All,
    > I'm developing an R application running inside of a Java daemon on
    > multiple threads, and interacting with the parent daemon via stdin and
    > stdout.
    > Everything works perfectly fine except for having some
2010 Dec 23
1
speed issues? read R_inferno by Patrick Burns: & a memory query
Hi,
I'm just starting out with R and came across R_inferno.pdf by Patrick Burns
just yesterday - I recommend it!
His description of how 'growing' objects (e.g. obj <- c(obj,
additionalValue)) eats up memory prompted me to rewrite a function (which
made such calls ~210 times) so that it used indexing into a dimensioned
object instead (i.e. obj[i, ] <- additionalValue).
This
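The rewrite the poster describes, sketched (values are made up; ~210 iterations as in the post):
n <- 210
grow <- function() {               # the slow pattern: copies obj every pass
  obj <- NULL
  for (i in 1:n) obj <- c(obj, i^2)
  obj
}
prealloc <- function() {           # the fix: allocate once, fill in place
  obj <- numeric(n)
  for (i in 1:n) obj[i] <- i^2
  obj
}
identical(grow(), prealloc())      # TRUE: same result, far fewer copies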
2007 Sep 27
0
Unnecessary extra copy with matrix(..., dimnames=NULL) (Was: Re: modifying large R objects in place)
As others already mentioned, in your example you are first creating an
integer matrix and then coercing it to a double matrix by assigning
(double) 1 to element [1,1].  However, even when correcting for this
mistake, there is an extra copy created when using matrix().
Try this in a fresh vanilla R session:
> print(gc())
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 136684  3.7    
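On a recent R build (with memory profiling enabled, the default for CRAN binaries) the coercion copy is easy to watch with tracemem(), which prints a line whenever the traced object is duplicated; a sketch, not the profiling session used above:
x <- matrix(0, 1000, 1000)    # double matrix from the start
tracemem(x)
x[1, 1] <- 1                  # typically modified in place: no trace output
y <- matrix(0L, 1000, 1000)   # integer matrix
tracemem(y)
y[1, 1] <- 1                  # coercion to double duplicates: tracemem fires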
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, with usually ~11 GB free.  I checked
system limits using ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
2002 Aug 06
2
Memory leak in R v1.5.1?
Hi,
I am trying to minimize a rather complex function of 5 parameters with 
gafit and nlm. Besides some problems with both optimization algorithms 
(with respect to consistently generating similar results), I tried to 
run this optimization about a hundred times for two further parameters.
Unfortunately, as the log below shows, during that batch process R 
starts to eat up all my RAM,
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related
issues - hopefully I'm not bringing up an old topic.
Am using RW1041 on Windows NT on a machine with 1 GB of memory.  Have a
function doit() that reads in a chunk of data using readBin, performs a
regression, saves out coeffs and then returns.  When using Rgui with the
default memory limit of 256Mb I'm able to
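A sketch of that doit() pattern (file name and record layout are hypothetical: pairs of doubles, x then y); each call reads one chunk, keeps only the coefficients, and leaves the chunk to be collected:
doit <- function(con, n) {
  v <- readBin(con, what = "double", n = 2 * n)  # n records of (x, y)
  m <- matrix(v, ncol = 2, byrow = TRUE)
  coef(lm(m[, 2] ~ m[, 1]))                      # keep only the small result
}
con <- file("data.bin", "rb")
coefs <- replicate(10, doit(con, 1e5))           # 10 chunks of 100000 records
close(con)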
2008 Apr 07
0
Some memory questions: data.frame and lists.
Hi there,
I seek your expert opinion on the following memory-related questions.  The
output below was obtained from R-2.6.2, compiled with
--enable-memory-profiling on Ubuntu Linux.
=======================================================================
>>> Code and output 1:
> gc( )
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 131180  7.1     350000 18.7   350000 18.7