similar to: cannot allocate vector of size in merge (PR#765)

Displaying 20 results from an estimated 1000 matches similar to: "cannot allocate vector of size in merge (PR#765)"

2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group, I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database I do most of the work in MySQL. But it would still be nice to work on some of the data using R. For that I can use a dedicated server with Gentoo Linux as OS, hosting only R. This server is a nice machine with two CPUs and 4 GB RAM which should do the job: Dual Intel XEON 3.06 GHz
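A rough sketch of pulling a manageable subset from MySQL into R via DBI/RMySQL; the database, table and column names here are only placeholders:

    library(DBI)
    library(RMySQL)
    con <- dbConnect(MySQL(), dbname = "mydb", host = "localhost",
                     user = "ruser", password = "secret")
    # let MySQL do the heavy filtering/aggregation and fetch only what R needs
    dat <- dbGetQuery(con, "SELECT id, value FROM measurements LIMIT 100000")
    dbDisconnect(con)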
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers, somehow it is not possible to increase nsize to more than 20000k. When I specify e.g. > R --vsize=10M --nsize=21000K the result is: free total (Mb) Ncells 99658 350000 6.7 Vcells 1219173 1310720 10.0 Maybe I have overlooked something... Marcus
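For scale, ?Memory puts a cons cell at about 28 bytes on a 32-bit platform, so the requested Ncell heap is itself a sizeable allocation:

    21000 * 1000 * 28 / 2^20   # ~560 MB for the Ncell heap implied by --nsize=21000K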
1999 May 15
2
vsize and nsize
I am running R version ??? under Red Hat 5.2. It seems as though the --nsize option has no effect on the size of the allocated Ncells as determined using gc(). Yes, I have that much data... That is, if I invoke R with R --vsize 100 --nsize 5000000 and then type gc() I get free total Ncells 92202 200000 Vcells 12928414 13107200 Thanks Tony Long Ecology and Evolutionary Biology Steinhaus
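Reading that gc() output back (Vcells are 8-byte blocks), the --vsize request seems to have been honoured, while the Ncell total is nowhere near what was asked for:

    13107200 * 8 / 2^20   # exactly 100 MB of Vcells, matching --vsize 100
    # the Ncells total of 200000 is far below the requested 5000000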
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus server. I can load 10 chips without a problem; however, when I try to load 143 I receive an error message: cannot create a vector of 523263 KB. I have expanded the memory of R as follows: R --min-vsize=10M --max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in the R help). After running this command the
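One era-appropriate workaround (an assumption, not from the original thread) is to avoid holding all 143 chips of probe-level data at once and let affy's justRMA() read and summarise the CEL files directly, which is typically less memory-hungry than ReadAffy() followed by rma(); the directory name is a placeholder:

    library(affy)
    cel_files <- list.files("celdir", pattern = "\\.CEL$", full.names = TRUE)
    # read and summarise the CEL files without building the full AffyBatch first
    eset <- justRMA(filenames = cel_files)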
2009 Jul 01
3
"Error: cannot allocate vector of size 332.3 Mb"
Dear R-helpers, I am running R version 2.9.1 on a Mac Quad with 32 GB of RAM running Mac OS X version 10.5.6. With over 20 GB of RAM "free" (according to the Activity Monitor) the following happens. > x <- matrix(rep(0, 6600^2), ncol = 6600) # So far so good. But I need 3 matrices of this size. > y <- matrix(rep(0, 6600^2), ncol = 6600) R(3219) malloc: ***
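The 332.3 Mb in the error is exactly one 6600 x 6600 matrix of doubles, and building it through rep() also creates a temporary vector of the same size on top of the matrix itself:

    6600^2 * 8 / 2^20                         # ~332.3 Mb per matrix of doubles
    x <- matrix(0, nrow = 6600, ncol = 6600)  # fills directly, skipping the rep() copy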
2015 Jan 15
2
default min-v/nsize parameters
Just wanted to start a discussion on whether R could ship with more appropriate GC parameters. Right now, loading the recommended package Matrix leads to: > library(Matrix) > gc() used (Mb) gc trigger (Mb) max used (Mb) Ncells 1076796 57.6 1368491 73.1 1198505 64.1 Vcells 1671329 12.8 2685683 20.5 1932418 14.8 Results may vary, but here R needed 64 MB of Ncells and 15 MB
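For anyone who wants larger starting heaps without patching R, ?Memory documents command-line options and environment variables for exactly this; the values below are purely illustrative:

    # in the shell, before starting R:
    #   R --min-nsize=2000000 --min-vsize=64M
    # or via the environment: R_NSIZE=2000000 R_VSIZE=64M
    Sys.getenv(c("R_NSIZE", "R_VSIZE"))   # check what the current session was started with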
2004 Aug 18
1
Memory Problems in R
Hello everyone - I have a couple of questions about memory management of large objects. Thanks in advance for your response. I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app. My system has 12.0 GB of memory, with usually ~11 GB free. I checked system limits using ulimit, and there is nothing set that would limit the maximum amount of memory for a process (with the
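In a case like this the 32-bit address space, not physical RAM, is the usual ceiling; a quick check from inside R:

    .Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on a 64-bit build
    # a 32-bit process can address at most ~4 GB (often less in practice),
    # regardless of how much of the 12 GB is physically free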
2000 Aug 25
3
unexpected R crash - again
Sorry, but I lost this thread, so I am sending this as a new message. This is really a follow-up to a post from a couple of days ago saying that fisher.test from the ctest library crashed on the following data set: > T [,1] [,2] [1,] 2 1 [2,] 2 1 [3,] 4 0 [4,] 8 0 [5,] 6 0 [6,] 0 0 [7,] 1 0 [8,] 1 1 [9,] 7 1 [10,] 8 2 [11,]
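The crash itself is not reproduced here, but a standard way to keep fisher.test() from exhausting memory on an r x c table is to simulate the p-value rather than enumerate the full network; the matrix below is only the first few rows standing in for the real data:

    Tsub <- matrix(c(2, 1, 2, 1, 4, 0, 8, 0), ncol = 2, byrow = TRUE)
    fisher.test(Tsub, simulate.p.value = TRUE, B = 10000)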
2010 May 20
1
ERROR: cannot allocate vector of size?
I've looked through all of the posts about this issue (and there are plenty!) but I am still unable to solve the error. ERROR: cannot allocate vector of size 455 Mb. I am using R 2.6.2 - x86_64 on a Linux x86_64 Red Hat cluster system. When I log in, based on the specs I provide [qsub -I -X -l arch=x86_64] I am randomly assigned to an x86_64 node. I am using package GenABEL. My data (~ 650,000
2010 Nov 04
1
Memory Management under Linux
Dear all, I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb. I have read carefully the instructions in ?Memory. Using the function gc() I get very low memory numbers (please see below). I know that it has been posted several times on r-help
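A detail worth noting (not from the original post): the failure is about one contiguous block, and on a 32-bit build the roughly 3 GB of usable address space fragments long before the gc() totals look large:

    231.8 * 2^20 / 8   # ~30 million doubles requested in the single vector that failed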
2010 Nov 05
1
improve R memory under linux
Dear all, I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb. I have read carefully the instructions in ?Memory. Using the function gc() I get very low memory numbers (please see below). I know that it has been posted several times on r-help
2010 Nov 05
1
R memory allocation in Linux
Dear all, I am using 32-bit Ubuntu Linux with 4 GB of RAM. I am running a very small script and I always get the same error message: cannot allocate vector of size 231.8 Mb. I have read carefully the instructions in ?Memory. Using the function gc() I get very low memory numbers (please see below). I know that it has been posted several times on r-help
1999 Nov 12
1
R-0.65.1 Startup
Dear R users, I have noticed that my R startup is extremely slow. It takes almost 3 minutes from "double-click" to R prompt. I have been running R-0.64.1 till recently and it took about 30 sec. I still have access to R-0.64.1. When I started it up, it took about 25 sec. Can anyone tell me if this is a bug in R or a problem with my machine? Note: This is after bootup with R being the
2011 Jan 17
3
"cannot allocate vector of size ..." in RHLE5 PAE kernel
Dear R community, I'm running 32-bit R on a 64-bit machine (with 16 GB of RAM) using a PAE kernel, as you can see here: $ uname -a Linux mymachine 2.6.18-238.el5PAE #1 SMP Sun Dec 19 14:42:44 EST 2010 i686 i686 i386 GNU/Linux When I try to create a large matrix ( Q.obs <- matrix(NA, nrow=6940, ncol=9000) ), I get the following error: > Error: cannot allocate vector of size 238.3
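The reported size is consistent with a logical matrix (NA without a type is logical, stored in 4 bytes per element); PAE raises the machine-wide memory limit but leaves each 32-bit process at roughly 3 GB of address space:

    6940 * 9000 * 4 / 2^20   # ~238.3 Mb, matching the error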
2017 Nov 22
2
function pointers?
We have a project that calls for the creation of a list of many distribution objects. Distributions can be of various types, with various parameters, but we ran into some problems. I started testing on a simple list of rnorm-based objects. I was a little surprised at the RAM storage requirements; here's an example: N <- 10000 closureList <- vector("list", N) nsize = sample(x
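The original code is cut off above, so the following is only a hypothetical sketch of such a list of closures, each remembering its own parameters through its enclosing environment:

    make_rnorm <- function(mean, sd) {
      force(mean); force(sd)                 # capture the parameters in the closure
      function(n) rnorm(n, mean = mean, sd = sd)
    }
    N <- 10
    closureList <- lapply(seq_len(N), function(i) make_rnorm(mean = i, sd = 1))
    closureList[[3]](5)                      # five draws from N(3, 1)
    # each closure drags an environment along with it, which is where the
    # surprising memory footprint of long lists of closures tends to come from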
2007 Oct 28
1
tree problem
I am trying to use tree to partition a data set. The data set has 3924 observations. Partitioning seems to work for small subsets of the data, but when I use the entire data set, no partitioning occurs. The variables are: RESP respondent to a survey (0 = not a respondent, 1 = respondent) AGE_P Age (continuous) ORIGIN_I Hispanic Ethnicity (1 = Hispanic, 2 = non-Hispanic) RACRECI2 Race
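A hedged sketch of loosening the default stopping rules, assuming the tree package and a data frame named dat holding the variables above:

    library(tree)
    fit <- tree(factor(RESP) ~ AGE_P + factor(ORIGIN_I) + factor(RACRECI2),
                data = dat,
                control = tree.control(nobs = nrow(dat), mindev = 0.001))
    # lowering mindev from its default of 0.01 lets tree() accept splits that
    # reduce deviance by a smaller fraction of the root deviance, a common
    # reason no split at all is made on the full data set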
2005 Jul 07
2
r: LOOPING
Hi all, I know that one should try to limit the amount of looping in R programs. I have supplied some code below. I am interested in seeing how the code could be rewritten if we don't use the loops. A brief overview of what is done in the code: 1. the input
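The poster's code is cut off above, so here is only a generic illustration of replacing an element-by-element loop with a single vectorised expression:

    x <- runif(1e5)
    out <- numeric(length(x))
    for (i in seq_along(x)) out[i] <- x[i]^2 + 1   # explicit loop
    out2 <- x^2 + 1                                # the same computation, vectorised
    all.equal(out, out2)                           # TRUE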
2000 May 30
6
heap size trouble
Hi, I've got trouble using R. When I want to load a file that contains 93 thousand rows and 22 columns of data (essentially floats), R shows me this error message: "heap size trouble". Could anyone tell me which parameter I should specify before launching R in order to load my big file? Thanks a lot
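For perspective, the data themselves are modest once stored as doubles; in R of that vintage the workspace did not grow on demand, so the fix was to start R with a larger --vsize (30M below is only an illustrative value, allowing for read.table overhead):

    93000 * 22 * 8 / 2^20   # ~15.6 Mb for the numeric data alone
    # e.g. launch with: R --vsize=30M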
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R and R really struggles! After starting R with the following: R --vsize 50M --nsize 6M --no-restore on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2), I run a function that essentially picks up an external dataset with 2121 rows and 30 columns and builds a lm() object and also runs step() ... the step() takes forever to run... (takes very
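The model frame itself is tiny here, so if step() really is hitting the heap, garbage-collection reporting will show it; the last line is a hypothetical stand-in for the actual call:

    2121 * 30 * 8 / 2^20   # ~0.5 Mb for the raw numeric data
    gcinfo(TRUE)           # print a message at every garbage collection
    # fit <- step(lm(y ~ ., data = dat))   # placeholder for the real model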
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using multiple logins. From the documentation, it appears that vsize is limited to 2^30-1, which tends to prove too restrictive for our use. When we drop that restriction (set vsize = NA) we end up hanging the server, which requires a restart. Is there any way to increase the memory limits on R while keeping our jobs from
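One approach (an assumption about the setup, with purely illustrative numbers) is to cap each login's session rather than removing the limit altogether, using the per-session options documented in ?Memory:

    # per-session invocation:
    #   R --max-vsize=1500M --max-nsize=200M
    # inside a session, gc() then shows how much of each heap is actually in use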