Similar to: "'methods' and environments."

Displaying 20 results from an estimated 3000 matches similar to: "'methods' and environments."

2004 Aug 18
1
Memory Problems in R
Hello everyone - I have a couple of questions about memory management of large objects. Thanks in advance for your response. I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app. My system has 12.0 GB of memory, usually with ~11 GB free. I checked system limits using ulimit, and there is nothing set that would limit the maximum amount of memory for a process (with the
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues? I have tried using gc() to see how much memory is used at each step. I have scanned Crawley's R Book, all the other R books I have available, and the on-line FAQ, but found no real help. Running WinXP Pro (32-bit) with 4 GB RAM. One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as virtual memory. I do have another machine
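A minimal sketch of the gc()-at-each-step pattern the poster describes (the object name big is a placeholder):

    big <- matrix(rnorm(1e6), ncol = 100)  # stand-in for a large object
    gc()       # note the Vcells "used" column while big is alive
    rm(big)    # drop the only reference
    gc()       # after collection "used" falls; the trigger may stay high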
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related issues - hopefully I'm not bringing up an old topic. Am using RW1041 on Windows NT on a machine with 1 GB of memory. Have a function doit() that reads in a chunk of data using readBin, performs a regression, saves out coefficients and then returns. When using Rgui with the default memory limit of 256 MB I'm able to
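A sketch of what such a chunk-wise worker might look like, reconstructed from the description only (doit's arguments, the file path, and the regression are all placeholders):

    doit <- function(path, n) {
      con <- file(path, open = "rb")
      on.exit(close(con))                        # release the connection even on error
      x <- readBin(con, what = "double", n = n)  # read one chunk of doubles
      fit <- lm(x ~ seq_along(x))                # stand-in regression
      coef(fit)                                  # return only the coefficients
    }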
2010 Oct 10
2
GC verbose=false still showing report
I must be reading the help file for gc() wrong. I thought it said that gc(verbose=FALSE) will run the garbage collection without printing the Ncells/Vcells summary. However, this is what I get:

> gc(verbose = FALSE)
         used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells 267097 14.3     531268  28.4   531268  28.4
Vcells 429302  3.3   20829406 159.0 55923977 426.7

I'm embedding this in an
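The likely explanation, sketched below: verbose= only controls extra messages printed during collection, while the table is gc()'s return value being auto-printed at top level, so capturing or hiding the value suppresses it:

    g <- gc(verbose = FALSE)   # assigning the result: nothing is printed
    invisible(gc())            # same effect without keeping the value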
2003 Dec 06
7
Windows Memory Issues
Hi all, I am currently building an application based on R 1.7.1 (+ compiled C/C++ code + MySql + VB). I am building this application to work on 2 different platforms (Windows XP Professional (500 MB memory) and Windows NT 4.0 with Service Pack 6 (1 GB memory)). This is a very memory-intensive application performing sophisticated operations on "large" matrices (typically 5000x1500
2003 Sep 11
2
(structured) programming style
I find that because R functions are call by value, and because there are no pointer or reference types (a la C++), I am making fairly heavy use of lexical scoping to modify variables. E.g.

outer <- function() {
  m <- matrix(0, 2, 2)
  inner <- function() {
    m[2, 2] <<- 3
    ...
  }
}

I am not too pleased with this, as it violates basic rules of structured programming, namely
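A common alternative, sketched here, is to pass an environment explicitly; environments are the one R structure handled by reference rather than by value, so no <<- is needed (make_state and inner are illustrative names):

    make_state <- function() {
      e <- new.env()
      e$m <- matrix(0, 2, 2)   # state lives inside the environment
      e
    }
    inner <- function(state) {
      state$m[2, 2] <- 3       # modifies the caller's matrix in place
    }
    s <- make_state()
    inner(s)
    s$m[2, 2]                  # 3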
2004 Aug 07
1
memory usage of S4 methods
Hi, I have some problems with the memory usage of S4 generics. For example, I observed the following behaviour:

> gc()
         used (Mb) gc trigger (Mb)
Ncells 432091 11.6     531268 14.2
Vcells 116052  0.9     786432  6.0
> setClass("A", representation(x = "numeric"))
[1] "A"
> setClass("B", representation(x = "numeric"))
[1] "B"
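A sketch of how such a measurement might continue, building on the classes above (the generic f and its method are hypothetical, not from the original post):

    setGeneric("f", function(object) standardGeneric("f"))
    setMethod("f", "A", function(object) object@x)
    before <- gc()
    a <- new("A", x = rnorm(10))
    invisible(f(a))
    after <- gc()
    after[, "used"] - before[, "used"]   # cells retained across the S4 calls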
2003 Jul 01
1
Warning message in scatter.smooth (modreg)
Dear list, In using the scatter.smooth() function (modreg) on a small data set (100 obs) the following warning was produced:

> scatter.smooth(Na, S)
Warning message:
k-d tree limited by memory. ncmax= 200

I haven't used scatter.smooth much but when I have, I haven't seen this message before. gc() returns

> gc()
         used (Mb) gc trigger
Ncells 417693 11.2     667722
2005 Jan 14
1
S3/S4 classes performance comparison
Hi R-devel, If you read my survey on R-help about reporting, you may have seen that I am implementing a way to handle outputs for R (the main target output destinations being XHTML and TeX). In fact, I do have something that works for basic objects, entirely done with S4 classes, with the results visible at: http://www.stat.ucl.ac.be/ROMA/sample.htm http://www.stat.ucl.ac.be/ROMA/sample.pdf To
2003 Oct 28
1
Loading a "sub-package"
Hi Folks, The inspiration for this query is described below, but it prompts a general question: If one wants to use only one or a few functions from a library, is there a way to load only these, without loading the library, short of going into the package source and extracting what is needed (including of course any auxiliary functions and compiled code they may depend on)? What prompted this
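In current R there are two partial answers, sketched below; note that neither avoids loading the package's namespace (and any compiled code it needs), they only avoid attaching its full export list (pkg and fun are placeholders):

    pkg::fun(x)                            # call one export without attaching the package
    library(pkg, include.only = c("fun"))  # attach only the named exports (R >= 3.6.0)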
2007 Mar 01
4
R File IO Slow?
Is R file I/O slow in general or am I missing something? It takes me 5 minutes to do a load(MYFILE) where MYFILE is a 27 MB Rdata file. Is there any way to speed this up? The one idea I have is having R call a C or Perl routine, reading the file in that language, converting the data into R objects, then sending them back into R. This is more work than I want to do, however, in loading Rdata
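Before writing C or Perl, a few things worth timing, as a sketch (MYFILE as in the post; x is a placeholder object):

    system.time(load("MYFILE"))                 # see where the 5 minutes go
    save(x, file = "MYFILE", compress = FALSE)  # trade file size for load speed
    saveRDS(x, "x.rds")                         # per-object alternative
    x <- readRDS("x.rds")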
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory. I understood that R takes a fixed amount of memory at startup (which I can influence with --vsize and --nsize) and that gc() shows the memory still free of the total memory reserved for R. However, if I create a long vector of character data, gc() only seems to reflect the space needed for a vector of pointers to char; the space used for the character data itself
2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi, when doing as.double() on an object that is already a double, the object seems to be copied internally, doubling the memory requirement. See example below. Same for as.character() etc. Is this intended? Example:

% R --vanilla
> x <- double(1e7)
> gc()
            used (Mb) gc trigger (Mb) max used (Mb)
Ncells    234019  6.3     467875 12.5   350000  9.4
Vcells  10103774 77.1
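A common workaround, sketched here, is to test before coercing, so an object that is already a double is never duplicated:

    x <- double(1e7)
    storage.mode(x)                        # "double", so no coercion is needed
    if (!is.double(x)) x <- as.double(x)   # coerce only when the type differs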
2016 Nov 10
2
Memory leak with tons of closed connections
Dear All, I'm developing an R application running inside of a Java daemon on multiple threads, and interacting with the parent daemon via stdin and stdout. Everything works perfectly fine except for having some memory leaks somewhere. Simplified version of the R app:

while (TRUE) {
  con <- file('stdin', open = 'r', blocking = TRUE)
  line <- scan(con,
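One plausible culprit, sketched below: each pass opens a fresh connection on 'stdin' and never closes it, so finished connections accumulate; closing each one (or opening stdin once outside the loop) avoids that (the nlines/quiet arguments are illustrative):

    repeat {
      con <- file("stdin", open = "r", blocking = TRUE)
      line <- scan(con, what = character(), nlines = 1, quiet = TRUE)
      close(con)               # without this, every iteration leaks a connection
      if (!length(line)) break
      # ... handle line ...
    }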
2010 Dec 23
1
speed issues? read R_inferno by Patrick Burns: & a memory query
Hi, I'm just starting out with R and came across R_inferno.pdf by Patrick Burns just yesterday - I recommend it! His description of how 'growing' objects (e.g. obj <- c(obj, additionalValue)) eats up memory prompted me to rewrite a function (which made such calls ~210 times) so that it used indexing into a dimensioned object instead (i.e. obj[i, ] <- additionalValue). This
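The two patterns side by side, as a sketch (n and the filled values are placeholders):

    n <- 210
    out <- NULL
    for (i in 1:n) out <- c(out, i^2)   # grows: the vector is copied on every pass

    out <- numeric(n)                   # preallocate once
    for (i in 1:n) out[i] <- i^2        # fill by index: no copies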
2007 Aug 16
2
Possible memory leak with R v.2.5.0
I'm working with a very large matrix (22k rows x 2k cols) of RNA expression data with R v.2.5.0 on a RedHat Enterprise machine, x86_64 architecture. The relevant code is below, but I call a function that takes a cluster of this data (a list structure that contains a $rows element which lists the rows (genes) in the cluster by ID, but not the actual data itself). The
2005 Jun 10
1
gc() and gc trigger
Hello, a question concerning the memory used and garbage collection after objects have been removed. What is wrong?

Before:

> gc()
          used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  313142  8.4    1801024   48.1   1835812   49.1
Vcells  809238  6.2  142909728 1090.4 178426948 1361.3

Here all attached objects
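The usual shape of this, sketched with a throwaway vector: after rm() plus gc() the "used" column drops, but "gc trigger" and "max used" record high-water marks and can stay large:

    x <- rnorm(1e7)
    gc()       # Vcells "used" is high; the trigger has grown to accommodate x
    rm(x)
    gc()       # "used" falls back; trigger / max used may remain high for a while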
2001 Mar 13
3
gc() shrinks with multiple iterations
Is it expected behavior for gc() to return shrinking values as it gets called multiple times? Here's what I've got:

> gc()
           used (Mb) gc trigger  (Mb)
Ncells   221754  6.0     467875  12.5
Vcells  3760209 28.7   14880310 113.6
> gc()
           used (Mb) gc trigger  (Mb)
Ncells   221760  6.0     467875  12.5
Vcells  3016206 23.1   11904247  90.9
> gc()
           used (Mb) gc
2002 Oct 11
1
growing process size in simulation
I came across this in a simulation I ran under 1.6.0. If I do something like

R> x <- rnorm(10)
R> rval <- NULL
R> for(i in 1:100000) rval <- t.test(x)$p.value

then the process size remains at about 14M under 1.5.1, but it seems to be almost linearly growing up to more than 100M under 1.6.0. I know that the above simulation is nonsense, but it was the simplest I could come up
2002 Aug 06
2
Memory leak in R v1.5.1?
Hi, I am trying to minimize a rather complex function of 5 parameters with gafit and nlm. Besides some problems with both optimization algorithms (with respect to consistently generating similar results), I tried to run this optimization about a hundred times for two further parameters. Unfortunately, as the log below shows, during that batch process R starts to eat up all my RAM,