
Displaying 20 results from an estimated 2000 matches similar to: "Understanding an R improvement that already occurred."

2007 Mar 28
2
Suggestion for memory optimization and as.double() with friends
Hi, when doing as.double() on an object that is already a double, the object seems to be copied internally, doubling the memory requirement. See example below. Same for as.character() etc. Is this intended? Example:

% R --vanilla
> x <- double(1e7)
> gc()
           used (Mb) gc trigger (Mb) max used (Mb)
Ncells   234019  6.3     467875 12.5   350000  9.4
Vcells 10103774 77.1
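A minimal sketch of how to check for such a copy, assuming an R build with memory profiling so that tracemem() is active (it is a no-op otherwise):

    x <- double(1e7)
    tracemem(x)        # prints a memory address; any copy of x is reported
    y <- as.double(x)  # if a duplication occurs, tracemem() reports it here
    untracemem(x)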
2010 Nov 23
2
Error: cannot allocate vector of size x Gb (64-bit ... yet again)
Hello, I am facing the dreaded "Error: cannot allocate vector of size x Gb" and don't understand enough about R (or operating system) memory management to diagnose and solve the problem -- despite studying previous posts and relevant R help -- e.g.: "Error messages beginning cannot allocate vector of size indicate a failure to obtain memory, either because the size exceeded
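When diagnosing this, a rough first step is to see what is already using memory before the failing allocation; a sketch:

    gc()   # current Ncells/Vcells usage and the trigger sizes
    # list workspace objects by size, largest first:
    sort(sapply(ls(), function(n) object.size(get(n))), decreasing = TRUE)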
2006 Nov 06
2
gc()$Vcells < 0 (PR#9345)
Full_Name: Don Maszle
Version: 2.3.0
OS: x86_64-unknown-linux-gnu
Submission from: (NULL) (206.86.87.3)

# On our new 32 GB x86_64 machine

R : Copyright 2006, The R Foundation for Statistical Computing
Version 2.3.0 (2006-04-24), ISBN 3-900051-07-0

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or
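For reference, gc() returns a matrix, so the Vcells figures can be inspected directly; the report concerns this value going negative on a very large-memory machine. A sketch:

    g <- gc()
    g["Vcells", "used"]   # should never be negative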
2006 Jan 26
1
maximizing available memory under windows XP
I have always been using editbin to set the 3GB switch in the Windows binary, but version 2.2.1 has this set by default (which I verified using dumpbin). However, when I generate junk data to fill up my memory and read the memory usage using gc(), I do not seem to get results as good with 2.2.1 patched as I did with 2.2.0 after editing the header. Under R 2.2.0 I was able to use over
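On the R side, the limit in such a setup was typically inspected and raised with the Windows-only helpers (removed in R >= 4.2); a sketch:

    memory.limit()             # current limit in MB
    memory.limit(size = 3000)  # request ~3 GB; only honoured if the OS
                               # (e.g. the /3GB switch) makes it available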
2005 Jun 10
1
gc() and gc trigger
hello, my question concerns the memory used and garbage collection after objects have been removed. What is wrong?

Before:
> gc()
          used (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  313142  8.4    1801024   48.1   1835812   49.1
Vcells  809238  6.2  142909728 1090.4 178426948 1361.3

Here all attached objects
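The behaviour to expect: removing objects shrinks the 'used' column only after a collection, while the 'gc trigger' column can stay high, which is normal. A sketch:

    rm(list = ls())  # drop all objects in the workspace
    gc()             # 'used' drops; 'gc trigger' may remain large, since it
                     # only records the level that provokes the next collection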
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
hello! I use a 32-bit Linux (SuSE) server, so I am limited to 3.5 GB of memory. I can demonstrate that from time to time there is a problem with allocating large objects, for example:

0. state (no objects created yet)
> gc()
         used (Mb) gc trigger (Mb) max used
Ncells 162070  4.4     350000  9.4   350000
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found that there can be large discrepancies in the size of the same object when saved as an external "xx.RData" file. The immediate repercussion is a possibly increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
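One common cause of such discrepancies is a closure or formula that captures its enclosing environment, so save() serializes everything reachable from it. A minimal sketch (names are illustrative):

    make_f <- function() {
      junk <- rnorm(1e6)   # large object in the enclosing frame
      function(z) z + 1    # the returned closure keeps 'junk' alive
    }
    f_big <- make_f()
    save(f_big, file = tempfile(fileext = ".RData"))
    # the file is large because 'junk' is serialized along with f_big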
2008 Mar 24
1
Cannot allocate large vectors (running out of memory?)
Hi. As shown in the simplified example below, I'm having trouble allocating memory for large vectors, even though it would appear that there is more than enough memory available. That is, even with a memory limit of 1500 MB, R 2.6.1 (Win) will allocate memory for a first vector of 285 MB, but not for a second vector of the same size. Forcing garbage collection does not seem
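On 32-bit builds the usual explanation is address-space fragmentation: R needs one contiguous block per vector, so a second large allocation can fail even when total free memory looks sufficient. Illustrative only:

    x <- numeric(285e6 / 8)  # ~285 MB of doubles: may succeed
    y <- numeric(285e6 / 8)  # same size: can fail on 32-bit even after gc(),
                             # if no contiguous 285 MB block remains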
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
dear R experts: I am of course no R expert, but I use it regularly. I thought I would share some experimentation with memory use. I run a Linux machine with about 4 GB of memory and R 2.5.0. Upon startup, gc() reports

         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 268755 14.4     407500 21.8   350000 18.7
Vcells 139137  1.1     786432  6.0   444750  3.4

This is my baseline. linux
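The per-object overhead is easy to measure directly; a plain double vector is close to 1x its raw size, so a 10-15x rule of thumb mostly reflects the temporary copies made while reading and transforming data. A sketch:

    n <- 1e6
    x <- rnorm(n)
    as.numeric(object.size(x)) / (8 * n)   # ~1 for a plain double vector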
2006 May 16
2
Large database help
Hello all. I have a large .txt file whose variables are in fixed-width columns, i.e., variable V1 occupies columns 1 to 7, V2 columns 8 to 23, etc. This is a 60 GB file with 90 variables and 60 million observations. I am working on a Pentium 4 with 1 GB RAM and Windows XP Pro. I tried the following code just to see if I could work with 2 variables, but it seems that is not possible: R : Copyright 2005, The R Foundation
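For fixed-width fields, read.fwf() can pull out just the wanted columns (the rest of each line is ignored), and limiting the number of records keeps a first test cheap. A sketch, with illustrative file name and column types:

    # V1 = columns 1-7, V2 = columns 8-23; remaining columns are not read
    dat <- read.fwf("bigfile.txt",
                    widths     = c(7, 16),
                    colClasses = c("integer", "numeric"),
                    n          = 100000)   # test on a manageable chunk first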
2008 Jul 20
2
Erro: cannot allocate vector of size 216.0 Mb
Please help: I have a 2 GB computer and a huge time series to embed, and I tried increasing memory.limit() and memory.size(max=TRUE), but nothing helped. Just before the command:

> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552  5.6     407500 10.9   350000  9.4
Vcells 125966  1.0     786432  6.0   496686  3.8
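Note that embed() materializes one shifted copy of the series per dimension, so the result needs roughly length(x) * dimension * 8 bytes; that, rather than the limit settings, is often what exhausts a 2 GB machine. Illustrative sizes:

    x <- rnorm(1e6)
    e <- embed(x, dimension = 20)   # ~160 MB: 20 lagged copies of x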
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs! I have the following problem: I have a function that calls one or more functions, depending on the input parameters. I am searching for the fastest way to select and execute the selected functions and return their results in a list. There are 10 possible functions, but usually only 2 are selected (although sometimes more, even all). For example, if I have function
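One simple, fast approach: keep the candidate functions in a named list, subset it by the selected names, and lapply() over the result, which returns the values in a named list. A sketch:

    funs <- list(mean = mean, median = median, sd = sd)   # up to 10 entries
    run.selected <- function(sel, x) lapply(funs[sel], function(f) f(x))
    run.selected(c("mean", "sd"), rnorm(100))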
2011 Jun 04
1
R Crashes when using "large" matrices (Ubuntu 11.04)
Sorry for re-posting, but the original one ended up inside a previous and unrelated thread. -- Matias ----- Hello, This simple SVD calculation (commands are copied immediately below) crashes on my Ubuntu machine (R 2.13.0). However it works fine on my Windows 7 machine, so I suspect there's a problem with (my?) Ubuntu and / or R. Can anybody else reproduce it (with Ubuntu 11.04)? Thanks
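The reported reproduction was along these lines (dimensions illustrative); a crash here, with the same code working elsewhere, usually points at the BLAS/LAPACK library the R build links against:

    set.seed(1)
    m <- matrix(rnorm(3000 * 2000), nrow = 3000)
    s <- svd(m)   # crashed on the poster's Ubuntu build, fine on Windows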
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB". I know that similar problems have been discussed a lot already, but I didn't find any satisfactory answers so far! Details:
*** I have XP (32-bit) with 4 GB RAM. At the time the problem appeared, I had 1.5 GB of available physical memory.
*** I increased the R memory limit to 3 GB via memory.limit(3000).
*** I did gc() and got
2008 Sep 09
1
Memory allocation problem (during kmeans)
Dear all, I am trying to apply k-means clustering to a data file (about 300 MB in size). I read this file using
x <- read.table('file path', sep = " ")
and then do
kmeans(x, 25)
but the process stops after two minutes with an error:
Error: cannot allocate vector of size 907.3 Mb
When I read the archives I noticed that the best solution is to use a 64-bit OS. "Error messages
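Two things that help here: give read.table() explicit colClasses so it does not over-allocate while guessing types, and convert to a matrix once, dropping the data frame before clustering. A sketch (file path illustrative, assuming all-numeric columns):

    x  <- read.table("file path", sep = " ", colClasses = "numeric")
    m  <- as.matrix(x)
    rm(x); gc()                    # free the data-frame copy first
    km <- kmeans(m, centers = 25)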
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers, somehow it is not possible to increase nsize to more than 20000k. When I specify, e.g.,
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something...
Marcus
--
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
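In current R the corresponding startup options are --min-nsize and --min-vsize, and the resulting behaviour can be watched from inside the session; a sketch:

    # from the shell:
    #   R --min-nsize=21000k --min-vsize=10M
    # from within R:
    gc(verbose = TRUE)   # reports Ncells/Vcells usage and trigger levels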
2011 Jan 17
1
isoreg memory leak?
I believe there is a memory leak in isoreg in the current version of R, as I believe the following shows:

> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 120405  3.3     350000  9.4   350000  9.4
Vcells  78639  0.6     786432  6.0   392463  3.0
> for (k in 1:100) {
+   y <- runif(10000)
+   isoreg(x, y)
+ }
> rm(x)
> rm(y)
> gc()
         used (Mb) gc
2008 Oct 04
3
environment and scoping
I haven't quite figured out how I can change the environment of a function. I have a main function and want to use different auxiliary functions, which I supply as a parameter (or by name). What I want to do is something like this:

main.fun <- function(aux.fun, dat) {
  x <- 1
  aux.fun()
}
aux.fun.one <- function() {
  mean(dat) + x
}
aux.fun.two <- function() {
  median(dat) - x
}

I don't want to
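One way to make this work is to re-point the auxiliary function's enclosing environment at main.fun's evaluation frame, so it can see both dat and x. A minimal sketch:

    main.fun <- function(aux.fun, dat) {
      x <- 1
      environment(aux.fun) <- environment()  # aux.fun now sees x and dat
      aux.fun()
    }
    aux.one <- function() mean(dat) + x
    aux.two <- function() median(dat) - x
    main.fun(aux.one, 1:10)   # 6.5
    main.fun(aux.two, 1:10)   # 4.5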
2007 Aug 16
2
Possible memory leak with R v.2.5.0
I'm working with a very large matrix (22k rows x 2k cols) of RNA expression data with R v2.5.0 on a Red Hat Enterprise machine, x86_64 architecture. The relevant code is below, but I call a function that takes a cluster of this data (a list structure that contains a $rows element, which lists the rows (genes) in the cluster by ID, but not the actual data itself). The
2010 Apr 05
3
Creating R packages, passing by reference and oo R.
Dear All, I would like some advice on creating R packages, passing by reference, and OO R. I have created a package that is neither elegant nor extensible and is rather cumbersome (though it works). I would like to rewrite the code to make the package distributable (should it be of interest) and easier to maintain. The package is for Bayesian model determination via a reversible-jump algorithm and has
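Since R's copy-on-modify semantics are usually the obstacle here, the standard idiom is to hold mutable state in an environment, which is passed by reference; later R versions also offer Reference Classes (setRefClass). A minimal sketch:

    new.counter <- function() {
      e <- new.env()
      e$n <- 0
      e
    }
    bump <- function(counter) {
      counter$n <- counter$n + 1   # modifies in place, no copy returned
      invisible(counter)
    }
    ctr <- new.counter()
    bump(ctr); bump(ctr)
    ctr$n   # 2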