Displaying 20 results from an estimated 5000 matches similar to: "Memory leak with tons of closed connections"
2016 Nov 11
0
Memory leak with tons of closed connections
>>>>> Gergely Daróczi <daroczig at rapporter.net>
>>>>> on Thu, 10 Nov 2016 16:48:12 +0100 writes:
> Dear All,
> I'm developing an R application running inside of a Java daemon on
> multiple threads, and interacting with the parent daemon via stdin and
> stdout.
> Everything works perfectly fine except for having some
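As a hedged aside (not part of the original message), a minimal way to audit and clean up connection usage in a long-running R process looks roughly like this; showConnections(), getAllConnections() and getConnection() are standard base R:

## List every connection R currently knows about.
print(showConnections(all = TRUE))

## Close user-created connections explicitly rather than waiting for gc();
## connections 0, 1 and 2 are stdin, stdout and stderr and are left alone.
for (i in setdiff(getAllConnections(), 0:2)) {
  close(getConnection(i))
}
gc()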
2016 Nov 11
2
Memory leak with tons of closed connections
On Fri, Nov 11, 2016 at 12:08 PM, Martin Maechler
<maechler at stat.math.ethz.ch> wrote:
>>>>>> Gergely Daróczi <daroczig at rapporter.net>
>>>>>> on Thu, 10 Nov 2016 16:48:12 +0100 writes:
>
> > Dear All,
> > I'm developing an R application running inside of a Java daemon on
> > multiple threads, and
2002 Aug 06
2
Memory leak in R v1.5.1?
Hi,
I am trying to minimize a rather complex function of 5 parameters with
gafit and nlm. Besides some problems with both optimization algorithms
(with respect to consistently generating similar results), I tried to
run this optimization about a hundred times over two further parameters.
Unfortunately, as the log below shows, during that batch process R
starts to eat up all my RAM,
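A hedged sketch of how such a batch run can be instrumented to confirm the growth; the objective function and start values below are placeholders, not the original code:

mem.used <- numeric(100)
for (run in 1:100) {
  fit <- nlm(objective.fn, start.values)   # placeholder objective / start values
  rm(fit)
  snapshot <- gc(reset = TRUE)             # reset the "max used" columns each run
  mem.used[run] <- sum(snapshot[, "used"]) # Ncells + Vcells still in use
}
mem.used   # a steadily increasing series points at something accumulating between runs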
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory.
I understood that R takes a fixed amount of memory at startup (which I can
influence with --vsize --nsize) and that gc() shows the memory still free of
the total memory reserved for R.
However, if I create a long vector of character data, gc() only seems to
reflect the space needed for a vector of pointers to char, the space used
for the character data itself
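For comparison, a small sketch (the vector below is just an example) of the two reports one can put side by side: what gc() shows before and after the allocation, versus what object.size() says about the vector itself:

gc()
x <- format(runif(1e6), digits = 15)   # a long vector of character data
object.size(x)                         # size of the vector as seen by R
gc()                                   # Ncells / Vcells after the allocation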
2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi,
when doing as.double() on an object that is already a double, the
object seems to be copied internally, doubling the memory requirement.
See example below. Same for as.character() etc. Is this intended?
Example:
% R --vanilla
> x <- double(1e7)
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 234019 6.3 467875 12.5 350000 9.4
Vcells 10103774 77.1
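One way to check whether such a coercion really duplicates the object is tracemem(), which is available in R builds with memory profiling enabled (the default for the CRAN binaries); a hedged sketch:

x <- double(1e7)
tracemem(x)         # start reporting copies of x
y <- as.double(x)   # a "tracemem[...]" message here would indicate a copy
untracemem(x)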
2010 Dec 23
1
speed issues? read R_inferno by Patrick Burns: & a memory query
Hi,
I'm just starting out with R and came across R_inferno.pdf by Patrick Burns
just yesterday - I recommend it!
His description of how 'growing' objects (e.g. obj <- c(obj,
additionalValue)) eat up memory prompted me to rewrite a function (which
made such calls ~210 times) so that it used indexing into a dimensioned
object instead (i.e. obj[i, ] <- additionalValue).
This
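A hedged illustration of the two idioms the Inferno contrasts (the object names are made up):

n <- 210

## growing: every c() call copies everything accumulated so far
grown <- NULL
for (i in 1:n) grown <- c(grown, sqrt(i))

## preallocating: the result is allocated once and filled in place
prealloc <- numeric(n)
for (i in 1:n) prealloc[i] <- sqrt(i)

identical(grown, prealloc)   # TRUE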
2005 Jun 10
1
gc() and gc trigger
hello,
The question concerns the memory used and garbage collection after having
removed objects. What is wrong?
before
-------
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 313142 8.4 1801024 48.1 1835812 49.1
Vcells 809238 6.2 142909728 1090.4 178426948 1361.3
here all attached objects
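What is usually expected here, shown as a small sketch (the object is illustrative): after rm() the "used" columns drop on the next gc(), while the "max used" columns only come down after gc(reset = TRUE), and the gc trigger shrinks gradually over later collections.

big <- matrix(rnorm(1e7), ncol = 100)
gc()               # "used" now includes the ~76 Mb of doubles in big
rm(big)
gc()               # "used" drops; "max used" and the trigger stay high
gc(reset = TRUE)   # resets the "max used" statistics as well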
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers,
After some investigation I have found there can be large discrepancies in the same object being saved as an external "xx.RData" file. The immediate repercussion of this is the possible increased size of your .RData workspace for no apparent reason.
The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
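One common source of such discrepancies, offered here as a hypothesis rather than a diagnosis of the function in question, is that formulas and closures created inside a function keep a reference to the function's environment, so large local objects get serialized with them. A sketch comparing the serialized sizes:

f.heavy <- function() {
  junk <- rnorm(1e6)   # large local object
  y ~ 1                # the formula's environment is this call frame, junk included
}

f.light <- function() {
  junk <- rnorm(1e6)
  f <- y ~ 1
  environment(f) <- new.env(parent = globalenv())   # drop the heavy frame
  f
}

length(serialize(f.heavy(), NULL))   # several MB: junk rides along
length(serialize(f.light(), NULL))   # a few hundred bytes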
2003 Dec 06
7
Windows Memory Issues
Hi all,
I am currently building an application based on R 1.7.1 (+ compiled
C/C++ code + MySql + VB). I am building this application to work on 2
different platforms (Windows XP Professional (500mb memory) and Windows
NT 4.0 with service pack 6 (1gb memory)). This is a very memory
intensive application performing sophisticated operations on "large"
matrices (typically 5000x1500
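For scale, a back-of-the-envelope check (independent of the original application): a 5000 x 1500 matrix of doubles is 7.5 million values at 8 bytes each, i.e. roughly 57 Mb per copy, before any duplicates made during the computation.

m <- matrix(0, nrow = 5000, ncol = 1500)
print(object.size(m), units = "Mb")   # about 57.2 Mb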
2004 Aug 18
1
Memory Problems in R
Hello everyone -
I have a couple of questions about memory management of large objects.
Thanks in advance for your response.
I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app.
My system has 12.0 GB of memory, with usually ~ 11GB free. I checked
system limits using ulimit, and there is nothing set that would limit
the maximum amount of memory for a process (with the
2010 Oct 10
2
GC verbose=false still showing report
I must be reading the help file for gc() wrong. I thought it said that
gc(verbose=FALSE) will run the garbage collection without printing the
Ncells/Vcells summary. However, this is what I get:
gc(verbose = FALSE)
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 267097 14.3 531268 28.4 531268 28.4
Vcells 429302 3.3 20829406 159.0 55923977 426.7
I'm embedding this in an
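A likely explanation, phrased here as a sketch rather than a definitive reading of the original setup: verbose = FALSE suppresses gc()'s extra commentary, but the Ncells/Vcells matrix is still the function's return value, and that value gets printed when the call sits at top level. Capturing or hiding the value keeps it quiet:

invisible(gc(verbose = FALSE))   # runs the collection, prints nothing
g <- gc(verbose = FALSE)         # summary captured in g instead of printed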
2001 Mar 13
3
gc() shrinks with multiple iterations
Is it expected behavior for gc() to return shrinking values as it gets
called multiple times? Here's what I've got:
> gc()
used (Mb) gc trigger (Mb)
Ncells 221754 6.0 467875 12.5
Vcells 3760209 28.7 14880310 113.6
> gc()
used (Mb) gc trigger (Mb)
Ncells 221760 6.0 467875 12.5
Vcells 3016206 23.1 11904247 90.9
> gc()
used (Mb) gc
2002 Oct 11
1
growing process size in simulation
I came across this in a simulation I ran under 1.6.0: If I do something
like
R> x <- rnorm(10)
R> rval <- NULL
R> for(i in 1:100000) rval <- t.test(x)$p.value
then the process size remains at about 14M under 1.5.1, but it seems to
be almost linearly growing up to more than 100M under 1.6.0.
I know that the above simulation is nonsense, but it was the simplest I
could come up
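A hedged way to distinguish growth of memory known to R from growth of the OS-level process size is to snapshot gc() inside the loop; the sketch below mirrors the shape of the example rather than reproducing it exactly:

x <- rnorm(10)
used <- numeric(0)
for (i in 1:100000) {
  rval <- t.test(x)$p.value
  if (i %% 10000 == 0)
    used <- c(used, sum(gc()[, "used"]))   # cells in use every 10000 iterations
}
used   # roughly flat if nothing accumulates on the R side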
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related
issues - hopefully I'm not bringing up an old topic.
Am using RW1041 on a Windows NT on a machine with 1Gb of memory. Have a
function doit() that reads in a chunk of data using readBin, performs a
regression, saves out coeffs and then returns. When using Rgui with the
default memory limit of 256Mb I'm able to
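A generic sketch of the chunked readBin pattern described above; the file name, element type and chunk size are placeholders, not details from the original doit():

con <- file("data.bin", "rb")     # hypothetical binary file of doubles
repeat {
  chunk <- readBin(con, what = "double", n = 1e6)
  if (length(chunk) == 0) break   # end of file
  ## ... fit the regression on chunk, keep only the coefficients ...
  rm(chunk)
  gc()                            # hand the memory back before the next chunk
}
close(con)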
2008 Jan 30
1
Understanding an R improvement that already occurred.
I was surprised to observe the following difference between 2.4.1 and
2.6.0 after a long overdue upgrade a few months ago of our
departmental server. It wasn't a bug fix, but a subtle improvement.
Here's the simplest example I could create. The size is excessive, on
the order of the Netflix Competition data.
The integer matrix is about 1.12 GB, and if coerced to numeric it is
2.24 GB.
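The factor of two follows directly from the storage modes (a worked check, not tied to the original data): integers take 4 bytes per element and doubles 8, so coercing an integer matrix to numeric doubles its footprint.

n <- 3e8                 # roughly the element count of a 1.12 GB integer matrix
n * 4 / 2^30             # ~1.12 GB stored as integer
n * 8 / 2^30             # ~2.24 GB stored as double

m <- matrix(1L, 1000, 1000)
object.size(m)           # ~4 MB as integer
object.size(m + 0)       # ~8 MB once coerced to double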
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB".
I know that similar problems were discussed a lot already, but I
didn't find any satisfactory answers so far!
Details:
*** I have XP (32-bit) with 4 GB RAM. At the time when the problem
appeared I had 1.5GB of available physical memory.
*** I increased R memory limit to 3GB via memory.limit(3000)
*** I did gc() and got
2009 Aug 18
1
Plyr and memory allocation issue
Dear R users
I am trying to create some new variables for a 4401 x 30 dataframe using
ddply and transform. The "id" variable I am using is a factor with 1330
levels, e.g.
bb <- function(df) {transform(df,
years = study.year - min(study.year) + 1,
periods = length(study.year)
)}
test <- ddply(x,.(id),bb)
I haven't copied the data to avoid clogging the
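For comparison, a hedged base-R version of the same two derived variables using ave(), which avoids ddply's per-group data frame copies (column and object names follow the snippet above; whether it resolves the memory allocation issue is not tested here):

x$years   <- x$study.year - ave(x$study.year, x$id, FUN = min) + 1
x$periods <- ave(x$study.year, x$id, FUN = length)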
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please,
I have a 2 GB computer and a huge time series to embed, and I tried
increasing memory.limit() and memory.size(max=TRUE), but nothing helped.
Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552 5.6 407500 10.9 350000 9.4
Vcells 125966 1.0 786432 6.0 496686 3.8
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues?
I have tried using gc() to see how much memory is used at each step.
Scanned Crawley's R Book and all other R books I have available and the
FAQ on-line, but found no real help.
Running WinXP Pro (32 bit) with 4 GB RAM.
One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as
virtual memory.
I do have another machine
2010 Apr 05
3
Creating R packages, passing by reference and oo R.
Dear All,
I would like some advice on creating R packages, passing by reference and oo R.
I have created a package that works, but it is neither elegant nor extensible and is rather cumbersome. I would like to rewrite the code to make the package distributable (should it be of interest) and easier to maintain.
The package is for Bayesian model determination via a reversible jump algorithm and has
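For the pass-by-reference part of the question, the usual base-R idiom is to keep mutable state in an environment, since environments are the one base object not copied on assignment; a generic sketch (not a recommendation for this particular package), an idiom that reference classes (setRefClass(), added to the methods package in later R releases) formalise:

new_sampler_state <- function() {
  state <- new.env()
  state$iteration <- 0
  state
}

advance <- function(state) {
  state$iteration <- state$iteration + 1   # modifies the caller's object, no copy
  invisible(state)
}

s <- new_sampler_state()
advance(s)
advance(s)
s$iteration   # 2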