
Displaying 20 results from an estimated 700 matches similar to: "Erro: cannot allocate vector of size 216.0 Mb"

2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB". I know that similar problems were discussed a lot already, but I didn't find any satisfactory answers so far! Details: *** I have XP (32-bit) with 4 GB RAM. At the time the problem appeared I had 1.5 GB of available physical memory. *** I increased the R memory limit to 3 GB via memory.limit(3000) *** I did gc() and got
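
A minimal sketch of the settings and checks being discussed, assuming a 32-bit Windows build of R (memory.limit() and memory.size() are Windows-only and have been removed from recent R releases); the 3000 MB figure is just the poster's value:

    memory.limit(size = 3000)   # raise the address-space cap for this session (32-bit Windows)
    memory.size()               # memory currently used by R, in MB
    memory.size(max = TRUE)     # maximum memory used so far in this session
    gc()                        # free unreferenced objects and print the allocation table
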
2006 May 22
1
win2k memory problem with merge()'ing repeatedly (long email)
Good afternoon, I have 63 small .csv files which I process daily; until two weeks ago they processed just fine, took only a matter of moments, and showed no noticeable memory problems. Two weeks ago they reached 318 lines and my script "broke". There are some missing values in some of the files. I have tried hard many times over the last two weeks to create a
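
A hedged sketch of the kind of repeated-merge workflow described here, with an invented directory name and key column; Reduce() folds merge() over the list of files, and rm()/gc() releases the intermediate pieces:

    files  <- list.files("daily", pattern = "\\.csv$", full.names = TRUE)  # hypothetical folder of 63 files
    pieces <- lapply(files, read.csv, stringsAsFactors = FALSE)

    # Merge everything on a shared key column (assumed here to be "id")
    combined <- Reduce(function(x, y) merge(x, y, by = "id", all = TRUE), pieces)

    rm(pieces); gc()   # drop the intermediate list and collect
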
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues? I have tried using gc() to see how much memory is used at each step. I have scanned the Crawley R Book and all the other R books I have available, as well as the on-line FAQ, but found no real help. Running WinXP Pro (32-bit) with 4 GB RAM. One SATA drive pair is in a RAID 0 configuration with 10000 MB allocated as virtual memory. I do have another machine
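
A small sketch of the step-by-step bookkeeping mentioned in this thread, using an invented object; object.size() reports one object's footprint and gc() shows the overall picture before and after removing it:

    x <- matrix(rnorm(1e6), ncol = 100)    # ~8 MB of doubles
    print(object.size(x), units = "Mb")    # footprint of this one object
    gc()                                   # allocation summary with x present
    rm(x)
    gc()                                   # Vcells usage should drop back down
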
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Dear R experts: I am of course no R expert, but I use it regularly. I thought I would share some experimentation with memory use. I run a Linux machine with about 4GB of memory, and R 2.5.0. Upon startup, gc() reports:

             used (Mb) gc trigger (Mb) max used (Mb)
    Ncells 268755 14.4     407500 21.8   350000 18.7
    Vcells 139137  1.1     786432  6.0   444750  3.4

This is my baseline. linux
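
A sketch of how a baseline-versus-loaded comparison like this can be made, with a hypothetical file big.csv; the "rule of thumb" factor comes from comparing the size on disk with the peak memory reported by gc() and the loaded object's footprint:

    file.info("big.csv")$size / 2^20        # size on disk, in MB (hypothetical file)

    gc(reset = TRUE)                        # reset the "max used" statistics
    dat <- read.csv("big.csv")
    gc()                                    # compare "max used (Mb)" against the file size
    print(object.size(dat), units = "Mb")   # in-memory footprint of the loaded data
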
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found there can be large discrepancies in the same object being saved as an external "xx.RData" file. The immediate repercussion of this is the possible increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
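
The poster's function is not shown here, but one common cause of this effect is an environment silently captured by a formula or closure: object.size() does not follow environments, while save() serializes them. A self-contained illustration, not the poster's code:

    f <- function() {
      big <- rnorm(1e6)                                        # ~8 MB the caller never sees
      lm(y ~ x, data = data.frame(x = 1:10, y = rnorm(10)))    # formula keeps f()'s environment
    }
    obj <- f()

    print(object.size(obj), units = "Mb")    # looks tiny
    tmp <- tempfile(fileext = ".RData")
    save(obj, file = tmp)
    file.info(tmp)$size / 2^20               # several MB, because 'big' was serialized too
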
2006 Nov 06
2
gc()$Vcells < 0 (PR#9345)
Full_Name: Don Maszle
Version: 2.3.0
OS: x86_64-unknown-linux-gnu
Submission from: (NULL) (206.86.87.3)

# On our new 32 GB x86_64 machine

R : Copyright 2006, The R Foundation for Statistical Computing
Version 2.3.0 (2006-04-24), ISBN 3-900051-07-0
R is free software and comes with ABSOLUTELY NO WARRANTY. You are welcome to redistribute it under certain conditions. Type 'license()' or
2002 Oct 11
1
growing process size in simulation
I came across this in a simulation I ran under 1.6.0: If I do something like

    R> x <- rnorm(10)
    R> rval <- NULL
    R> for(i in 1:100000) rval <- t.test(x)$p.value

then the process size remains at about 14M under 1.5.1, but it seems to be almost linearly growing up to more than 100M under 1.6.0. I know that the above simulation is nonsense, but it was the simplest I could come up
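
A sketch of how the growth could be tracked from inside the poster's own toy loop, sampling gc() output every 10,000 iterations; under a build without the problem the recorded Vcells usage should stay flat:

    x <- rnorm(10)
    rval <- NULL
    usage <- numeric(0)
    for (i in 1:100000) {
      rval <- t.test(x)$p.value
      if (i %% 10000 == 0) usage <- c(usage, gc()[2, 2])   # Vcells "(Mb)" currently in use
    }
    usage
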
2007 Aug 09
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Hi, I've been having similar experiences and haven't been able to substantially improve the efficiency using the guidance in the I/O Manual. Could anyone advise on how to improve the following scan()? It is not based on my real file; please assume that I do need to read in characters, and can't do any pre-processing of the file, etc. ## Create Sample File
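
A hedged sketch of scan() arguments that usually help with files like this (a fully specified what= template, explicit sep and quote, quiet output); the file name and column layout are invented:

    dat <- scan("sample.csv",
                what = list(id = character(), x = numeric(), y = numeric()),
                sep = ",", quote = "\"", quiet = TRUE)
    dat <- as.data.frame(dat, stringsAsFactors = FALSE)
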
2006 Oct 30
1
nlme Error: Subscript out of bounds
Hello, I am new to non-linear growth modelling in R and I am trying to reproduce an analysis that was done (successfully) in S-Plus. I have a simple non-linear growth model, with no nesting. I have attempted to simplify the call as much as possible (by creating another grouped object, instead of using subset=, and by compacting the fixed and random expressions). This is what the grouped
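
For reference, a self-contained nlme call of roughly this shape on a built-in grouped dataset; this is the standard example from the nlme documentation, not the poster's model or data:

    library(nlme)
    fm <- nlme(height ~ SSasymp(age, Asym, R0, lrc),
               data   = Loblolly,                       # a groupedData object shipped with R
               fixed  = Asym + R0 + lrc ~ 1,
               random = Asym ~ 1,
               start  = c(Asym = 103, R0 = -8.5, lrc = -3.3))
    summary(fm)
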
2007 Jan 17
3
R.oo Destructors
Has anyone figured out how to create a destructor in R.oo? How I'd like to use it: I have an object which opens a connection through RODBC (held as a private member). It would be nice if the connection closed automatically (inside the destructor) when the object gets gc()'ed. Thanks in advance. Regards, Ken BTW, a >BIG< thanks to Henrik Bengtsson for creating the R.oo package! Lucky
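
One way to get destructor-like behaviour in base R is reg.finalizer() on an environment that holds the handle; R.oo objects are environment-based, so a similar hook may exist there, but check the package documentation. A sketch with a hypothetical DSN:

    library(RODBC)

    make_db <- function(dsn) {
      self <- new.env()
      self$channel <- odbcConnect(dsn)                    # open the private connection
      reg.finalizer(self, function(e) {
        if (!is.null(e$channel)) odbcClose(e$channel)     # runs when 'self' is gc()'ed
      }, onexit = TRUE)
      self
    }

    db <- make_db("my_dsn")     # hypothetical DSN
    rm(db); invisible(gc())     # finalizer fires at a subsequent garbage collection
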
2007 Jun 27
2
Meta-Analysis of proportions
Dear colleagues, I'm conducting a meta-analysis of studies evaluating adherence of HIV-positive drug users to AIDS treatment, so I'm looking for some advice and a syntax suggestion for running the meta-regression using proportions, not the usual OR/RR frequently used in RCT studies. I have already searched several handbooks, R manuals, mailing lists, professors, but... not
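
One hedged possibility is the metafor package with logit-transformed proportions; the data frame, variable names and covariate below are placeholders, and the choice of transformation is a modelling decision rather than a recommendation:

    library(metafor)

    # One row per study: events out of n, plus a study-level covariate
    dat <- data.frame(events    = c(30, 45, 12, 60, 25),
                      n         = c(100, 150, 40, 180, 90),
                      covariate = c(0, 1, 1, 0, 1))

    dat <- escalc(measure = "PLO", xi = events, ni = n, data = dat)   # logit proportions + variances
    res <- rma(yi, vi, mods = ~ covariate, data = dat)                # meta-regression
    predict(res, transf = transf.ilogit)                              # back-transform to proportions
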
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs! I have the following problem. I have a function that calls one or more functions, depending on the input parameters. I am searching for the fastest way to select and execute the selected functions and return their results in a list. The number of possible functions is 10; however, usually only 2 are selected (although sometimes more, even all). For example, if I have function
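
A small sketch of the dispatch pattern being asked about: keep the candidate functions in a named list and apply only the selected ones. The function names here are placeholders:

    funs <- list(mean = mean, sd = sd, median = median, mad = mad)

    run_selected <- function(x, which) {
      lapply(funs[which], function(f) f(x))   # named list of results, one per selected function
    }

    run_selected(rnorm(100), c("mean", "sd"))
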
2008 Mar 24
1
Cannot allocate large vectors (running out of memory?)
Hi. As shown in the simplified example below, I'm having trouble allocating memory for large vectors, even though it would appear that there is more than enough memory available. That is, even with a memory limit of 1500 MB, R 2.6.1 (Win) will allocate memory for a first vector of 285 MB, but not for a second vector of the same size. Forcing garbage collection does not seem
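
A sketch of the kind of repro described (two ~285 MB allocations under a 1500 MB limit on 32-bit Windows); if the second allocation fails, it is usually address-space fragmentation rather than total memory that is the problem:

    memory.limit(size = 1500)     # 32-bit Windows builds only
    x <- numeric(285e6 / 8)       # ~285 MB of doubles
    gc()
    y <- numeric(285e6 / 8)       # may fail with "cannot allocate vector of size ..."
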
2006 May 16
2
Large database help
Hello all. I have a large .txt file whose variables are in fixed-width columns, i.e., variable V1 goes from columns 1 to 7, V2 from 8 to 23, etc. This is a 60 GB file with 90 variables and 60 million observations. I'm working with a Pentium 4, 1 GB RAM, Windows XP Pro. I tried the following code just to see if I could work with 2 variables but it seems not possible: R : Copyright 2005, The R Foundation
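
One memory-friendly sketch under the layout described (V1 in columns 1-7, V2 in columns 8-23): read the file in chunks with readLines() and keep only running summaries rather than the whole data set; the file name and chunk size are placeholders:

    con <- file("big.txt", open = "r")            # hypothetical file
    total <- 0; rows <- 0
    repeat {
      lines <- readLines(con, n = 100000)         # 100,000 lines per chunk
      if (length(lines) == 0) break
      V1 <- substr(lines, 1, 7)                   # columns 1-7
      V2 <- as.numeric(substr(lines, 8, 23))      # columns 8-23
      total <- total + sum(V2, na.rm = TRUE)      # accumulate a summary instead of storing rows
      rows  <- rows + length(lines)
    }
    close(con)
    total / rows                                  # e.g. the overall mean of V2
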
2004 Jul 19
2
Evaluating the Yield of Medical Tests
Hello, I'm a biostatistician in Toronto. I would like to know if there is anything in survival analysis developed in R for the method in "Evaluating the Yield of Medical Tests" (JAMA, May 14, 1982, Vol 247, No. 18; Frank E. Harrell, Jr, PhD; Robert M. Califf, MD; David B. Pryor, MD; Kerry L. Lee, PhD; Robert A. Rosati, MD). Hope to hear from you and thanks. Lisa Wang, MSc, Project Organiser
2006 May 05
1
converting code into a function - seperating a data frame with n columns into n individual vectors
I have many very large data frames with 20 columns each. In order to conserve memory, I wish to separate each data frame into 20 vectors, each named with the name of the data frame followed by .1, .2, .3, ..., .20. (For example purposes, one data frame is named "testa".) e.g. testa.1, testa.2, testa.3. I have written the code to do this (see below). I am trying to convert this into a function that I can reuse.
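
A compact sketch of what such a function might look like, using assign() to create one vector per column in the caller's workspace; "testa" follows the poster's example name:

    split_to_vectors <- function(df, name = deparse(substitute(df)), envir = parent.frame()) {
      for (i in seq_along(df)) {
        assign(paste(name, i, sep = "."), df[[i]], envir = envir)   # e.g. testa.1, testa.2, ...
      }
      invisible(NULL)
    }

    testa <- data.frame(a = 1:3, b = letters[1:3])
    split_to_vectors(testa)   # creates testa.1 and testa.2
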
2010 Apr 19
2
plotting RR, 95% CI as table and figure in same plot
Hi all-- I am in the process of helping colleagues write up a manuscript in which we fit zero-inflated Poisson models. I would prefer plotting the rate ratios and 95% CIs (as I've found Gelman and others convincing about plotting tables...), but our journals usually like the numbers themselves. Thus, I'm looking at a recent JAMA article in which both the numbers and a dotplot of RR and 95% CI are
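
A hedged base-graphics sketch of that JAMA-style layout (numbers in the left margin, dotplot of RR with 95% CI on the right); the rate ratios below are invented:

    rr  <- c(1.20, 0.85, 1.05)
    lo  <- c(0.95, 0.60, 0.80)
    hi  <- c(1.52, 1.21, 1.38)
    lab <- c("Group A", "Group B", "Group C")

    op <- par(mar = c(4, 12, 2, 2))                       # wide left margin for the numbers
    plot(rr, seq_along(rr), xlim = c(0.5, 2), ylim = c(0.5, length(rr) + 0.5),
         log = "x", pch = 16, yaxt = "n", xlab = "Rate ratio (95% CI)", ylab = "")
    segments(lo, seq_along(rr), hi, seq_along(rr))        # confidence intervals
    abline(v = 1, lty = 2)                                # reference line at RR = 1
    axis(2, at = seq_along(rr), las = 1,
         labels = sprintf("%s  %.2f (%.2f-%.2f)", lab, rr, lo, hi))
    par(op)
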
2010 Aug 31
2
Error: cannot allocate vector of size 198.4 Mb
Hi, all. I have a problem with R memory space. I am getting "Error: cannot allocate vector of size 198.4 Mb". I've tried with:

    > memory.limit(size=2047);
    [1] 2047
    > memory.size(max=TRUE);
    [1] 12.75
    > library('RODBC');
    > Channel <- odbcConnectAccess('c:/test.MDB');   # inputdata: 15 cols, 2000000
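
A hedged sketch of pulling the Access table in pieces through RODBC rather than all 2,000,000 rows at once; the table name "inputdata" and the column names are placeholders:

    library(RODBC)
    Channel <- odbcConnectAccess('c:/test.MDB')

    first <- sqlFetch(Channel, "inputdata", max = 100000)        # first block of rows
    more  <- sqlFetchMore(Channel, max = 100000)                 # next block from the open result set

    # ...or restrict the columns up front so less has to fit in memory:
    two_cols <- sqlQuery(Channel, "SELECT col1, col2 FROM inputdata")

    odbcClose(Channel)
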
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello! I use a 32-bit Linux (SuSE) server, so I'm limited to about 3.5 GB of memory. I can demonstrate that from time to time there is a problem with allocating objects of large size. For example, state 0 (no objects created yet):

    > gc()
             used (Mb) gc trigger (Mb) max used (Mb)
    Ncells 162070  4.4     350000  9.4   350000
2006 May 12
4
bitwise addition
Hello all again, I want to do bitwise addition in R. I am trying to generate the matrix

    0000
    0001
    0010
    ....
    1111

I know the other ways of generating this matrix, but I need to look at bitwise addition. Any suggestions? Thanks a lot, Nameeta
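
Two hedged ways to build that 0000-1111 matrix: enumerating the bit combinations with expand.grid(), or masking the integers 0:15 with bitwAnd() (in base R since 3.0.0; the bitops package offers the same operation for older versions):

    # All 4-bit patterns by enumeration; rows read 0000, 0001, ..., 1111 left to right
    m1 <- as.matrix(expand.grid(b4 = 0:1, b3 = 0:1, b2 = 0:1, b1 = 0:1))[, 4:1]

    # The same matrix derived from the integers 0:15 with bitwise AND against each bit mask
    m2 <- sapply(c(8, 4, 2, 1), function(mask) as.integer(bitwAnd(0:15, mask) > 0))
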