Displaying 20 results from an estimated 3000 matches similar to: "Creating R packages, passing by reference and oo R."
2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi,
when calling as.double() on an object that is already a double, the
object seems to be copied internally, doubling the memory requirement.
See the example below. The same holds for as.character() etc. Is this intended?
Example:
% R --vanilla
> x <- double(1e7)
> gc()
           used (Mb) gc trigger (Mb) max used (Mb)
Ncells   234019  6.3     467875 12.5   350000  9.4
Vcells 10103774 77.1
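A minimal way to observe such copies is tracemem(), available in builds with memory profiling enabled (as the CRAN binaries are); whether as.double() still copies depends on the R version, so this is a sketch, not a claim about current internals:
x <- double(1e7)
tracemem(x)         # mark x; any internal duplication is then reported
y <- as.double(x)   # on affected versions this prints a copy message
gc()                # compare the Vcells column before and after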
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello!
------
I use a 32-bit Linux (SuSE) server, so I'm limited to about 3.5 GB of memory.
I can demonstrate that from time to time there is a problem allocating
large objects, for example:
0. state (no objects yet created)
------------------------------------
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162070 4.4     350000  9.4   350000
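For context, a sketch of the kind of allocation that hits the 32-bit ceiling (the size is illustrative, not taken from the post):
x <- numeric(4e8)   # ~3 GB of doubles; on a 32-bit build this typically
                    # fails with "Error: cannot allocate vector of size ..."
                    # because process address space runs out, not RAM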
2006 May 16
2
Large database help
Hello all.
I have a large .txt file with fixed-width columns,
i.e., variable V1 occupies columns 1 to 7, V2 columns 8 to 23, etc.
This is a 60GB file with 90 variables and 60 million observations.
I'm working with a Pentium 4, 1GB RAM, Windows XP Pro.
I tried the following code just to see if I could work with 2 variables,
but it seems not to be possible:
R : Copyright 2005, The R Foundation
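One hedged approach for a file this size: read only the needed fields, a limited number of rows at a time. The file name and chunk size below are assumptions; read.fwf() extracts just the widths given and ignores the rest of each record:
dat <- read.fwf("bigfile.txt",        # hypothetical file name
                widths = c(7, 16),    # V1 = cols 1-7, V2 = cols 8-23
                col.names = c("V1", "V2"),
                n = 100000)           # first 100,000 rows only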
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please help:
I have a 2GB computer and a huge time series to embed, and I tried
increasing memory.limit() and memory.size(max=TRUE), but nothing worked.
Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552 5.6     407500 10.9   350000 9.4
Vcells 125966 1.0     786432  6.0   496686 3.8
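For reference, on the Windows builds of that era the cap could be raised explicitly; memory.limit() was Windows-only (and is defunct in current R), so treat this as a historical sketch:
memory.limit()              # report the current limit in MB
memory.limit(size = 2047)   # request a higher limit, up to what the OS allows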
2008 Mar 24
1
Cannot allocate large vectors (running out of memory?)
Hi.
As shown in the simplified example below, I'm having trouble allocating
memory for large vectors, even though it would appear that there is more
than enough memory available. That is, even with a memory limit of 1500 MB,
R 2.6.1 (Win) will allocate memory for a first vector of 285 MB, but not for
a second vector of the same size. Forcing garbage collection does not seem
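The usual explanation is address-space fragmentation: each vector needs one contiguous block. A sketch of the failure mode (sizes illustrative):
x <- numeric(285e6 / 8)   # first ~285 MB vector: one contiguous block, fine
y <- numeric(285e6 / 8)   # second may fail on 32-bit builds: enough total
                          # memory, but no single 285 MB gap left in the
                          # process address space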
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs,
I have the following problem: I have a function that calls one or more
functions, depending on the input parameters. I am searching for the fastest
way to select and execute the chosen functions and return their results in
a list. The number of possible functions is 10, but usually only 2 are
selected (although sometimes more, even all of them).
For example, if I have function
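Since the post is truncated here, a minimal sketch of one common pattern (function names are placeholders): keep the candidates in a named list and subset it.
funs <- list(f1 = mean, f2 = median, f3 = sd)   # pool of candidates
run.selected <- function(which, x)
  lapply(funs[which], function(f) f(x))         # results returned as a list
run.selected(c("f1", "f3"), rnorm(100))         # run only f1 and f3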
2006 Jan 26
1
maximizing available memory under windows XP
I have always been using editbin to set the 3GB switch in the Windows
binary, but version 2.2.1 has this set by default (which I verified using
dumpbin). However, when I generate junk data to fill up my memory and read
the memory usage using gc(), it seems that I am not getting as good results
with 2.2.1 patched as I was with 2.2.0 after I edited the header. Under R
2.2.0 I was able to use over
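A sketch of the fill-and-measure test described (block count and size are assumptions):
junk <- vector("list", 30)
for (i in seq_along(junk)) junk[[i]] <- numeric(1e7)  # ~76 MB per block
gc()   # "max used (Mb)" records the peak footprint reached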
2011 Jan 17
1
isoreg memory leak?
I believe there is a memory leak in isoreg in the current version of R,
as the following appears to show:
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 120405 3.3     350000  9.4   350000 9.4
Vcells  78639 0.6     786432  6.0   392463 3.0
> x <- runif(10000)   # x must exist for the loop; its definition is
>                     # truncated out of this snippet
> for(k in 1:100) {
+   y <- runif(10000)
+   isoreg(x, y)
+ }
> rm(x)
> rm(y)
> gc()
used (Mb) gc
2005 Jun 10
1
gc() and gc trigger
Hello,
my question concerns the memory used and garbage collection after having
removed objects. What is wrong?
before
-------
> gc()
         used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells 313142  8.4    1801024   48.1   1835812   49.1
Vcells 809238  6.2  142909728 1090.4 178426948 1361.3
here all attached objects
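For reference, a sketch of how these statistics behave; the trigger is managed by R itself, while gc(reset = TRUE) resets only the max-used columns:
x <- numeric(5e7)
gc()               # the trigger grows to accommodate the allocation
rm(x)
gc()               # "used" drops, but the trigger can stay high
gc(reset = TRUE)   # reset the "max used" statistics only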
2007 Jan 17
3
R.oo Destructors
Has anyone figured out how to create a destructor in R.oo?
How I'd like to use it: I have an object which opens a connection through RODBC
(held as a private member). It would be nice if the connection closed automatically
(inside the destructor) when the object gets gc()'ed.
Thanks in advance.
Regards,
Ken
BTW, a >BIG< thanks to Henrik Bengtsson for creating the R.oo package!
Lucky
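R.oo does provide a finalize() hook in later versions; the underlying base-R mechanism is reg.finalizer(), sketched here with hypothetical names (makeConn, dsn):
makeConn <- function(dsn) {
  obj <- new.env()
  obj$channel <- RODBC::odbcConnect(dsn)   # the private connection
  reg.finalizer(obj, function(e) {
    RODBC::odbcClose(e$channel)            # runs when obj gets gc()'ed
  }, onexit = TRUE)                        # also run at session exit
  obj
}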
2008 Oct 04
3
environment and scoping
I haven't quite figured out how I can change the environment of a function.
I have a main function and want to use different auxiliary functions, which I supply as a parameter (or by name). What I want to do is something like this:
main.fun <- function(aux.fun, dat){
  x <- 1
  aux.fun()        # the post had fun.dat(), presumably a typo
}
aux.fun.one <- function(){
  mean(dat) + x
}
aux.fun.two <- function(){   # the post named both functions aux.fun.one
  median(dat) - x
}
I don't want to
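As written, aux.fun.one can see neither x nor dat. A minimal sketch of one way to make it work (my illustration, not the poster's eventual solution): re-point the auxiliary function's environment at main.fun's frame.
main.fun <- function(aux.fun, dat) {
  x <- 1
  environment(aux.fun) <- environment()   # aux.fun now resolves x and dat
  aux.fun()                               # in main.fun's local frame
}
aux.one <- function() mean(dat) + x
main.fun(aux.one, 1:10)                   # mean(1:10) + 1 == 6.5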
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues?
I have tried using gc() to see how much memory is used at each step.
I have scanned Crawley's R Book, all the other R books I have available,
and the on-line FAQ, but found no real help.
Running WinXP Pro (32 bit) with 4 GB RAM.
One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as
virtual memory.
I do have another machine
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers,
After some investigation I have found that there can be large discrepancies in the size of the same object when saved to an external "xx.RData" file. The immediate repercussion is a possibly increased size of your .RData workspace for no apparent reason.
The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
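Since the function itself is cut off here, a sketch of one common cause of such discrepancies (my example, not the poster's code): a returned object that captures a large enclosing environment, e.g. through a formula.
f <- function() {
  big <- numeric(1e7)    # ~76 MB, local to f
  lm(y ~ x, data = data.frame(x = 1:10, y = rnorm(10)))  # the formula keeps
}                        # a reference to f's environment, including big
m <- f()
save(m, file = "m.RData")   # serializes big along with the model
file.size("m.RData")        # far larger than the model alone would suggest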
2005 Jul 12
1
allocation of large matrix failing
Hello, this is probably something silly that I am doing, but I cannot
understand why this allocation is not happening.
Here is my C code, which tries to allocate a list of size 333559 and
then a matrix of size 8*333559.
I thought I might be running into memory problems, but R is not even
using that much (I start R with more memory and it stays constant).
Also, I start R as I normally do and
I
2006 May 05
1
converting code into a function - separating a data frame with n columns into n individual vectors
I have many very large data frames with 20 columns
each.
In order to conserve memory, I wish to separate each
data frame into 20 vectors, each named after the
data frame followed by .1, .2, .3, ..., .20.
(For example purposes, one data frame is named
"testa".)
e.g. testa.1, testa.2, testa.3
I have written the code to do this (see below). I am
trying to convert this into a function that I can
reuse.
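The code itself is truncated here, so a minimal sketch of such a function (names are mine): assign each column into an environment under the derived name.
df.to.vectors <- function(df, name = deparse(substitute(df)),
                          envir = .GlobalEnv) {
  for (i in seq_along(df))
    assign(paste(name, i, sep = "."), df[[i]], envir = envir)
}
testa <- data.frame(a = 1:3, b = 4:6)
df.to.vectors(testa)   # creates testa.1 and testa.2 in the global environment
Note that this does not by itself save memory: the columns are already stored as separate vectors inside the data frame.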
2010 Nov 04
1
Memory Management under Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: "Error: cannot allocate vector of size 231.8 Mb".
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
2010 Nov 05
1
improve R memory under linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: "Error: cannot allocate vector of size 231.8 Mb".
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
2010 Nov 05
1
R memory allocation in Linux
Dear all,
I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: "Error: cannot allocate vector of size 231.8 Mb".
I have read the instructions in ?Memory carefully. Using the function gc() I get very low memory numbers (please see below). I know that this has been posted several times on r-help
2005 Jan 14
1
S3/S4 classes performance comparison
Hi R-devel,
If you read my survey on R-help about reporting, you may have seen that
I am implementing a way to handle outputs for R (main target output
destinations: XHTML and TeX).
In fact, I do have something that works for basic objects, done entirely
with S4 classes, with the results visible at:
http://www.stat.ucl.ac.be/ROMA/sample.htm
http://www.stat.ucl.ac.be/ROMA/sample.pdf
To
2007 Mar 01
4
R File IO Slow?
Is R file I/O slow in general, or am I missing
something? It takes me 5 minutes to do a load(MYFILE),
where MYFILE is a 27 MB .Rdata file. Is there any way
to speed this up?
The one idea I have is having R call a C or Perl
routine, reading the file in that language, converting
the data into R objects, then sending them back into
R. This is more work than I want to do, however, in
loading Rdata
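A first diagnostic sketch (file names hypothetical): time the load and compare compression settings, which often dominate load() time.
system.time(load("MYFILE.Rdata"))        # baseline
nm <- load("MYFILE.Rdata")               # load() returns the object names
obj <- get(nm[1])
save(obj, file = "fast.Rdata", compress = FALSE)  # trade disk space for speed
system.time(load("fast.Rdata"))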