similar to: R memory issue for writing out the file

Displaying 20 results from an estimated 1100 matches similar to: "R memory issue for writing out the file"

2006 Jun 06
1
Ampersand Crashes Ruby
I'm using acts_as_ferret and when I call Object.find_by_contents("A & B"), Ruby dies with the following message: ^Cruby(5014,0xa000cf60) malloc: *** vm_allocate(size=1069056) failed (error code=3) ruby(5014,0xa000cf60) malloc: *** error: can't allocate region ruby(5014,0xa000cf60) malloc: *** set a breakpoint in szone_error to debug ruby(5014,0xa000cf60) malloc:
2006 Apr 18
1
NoMemoryError
I am using the Openbase adapter and have had a similar glitch here and there, but after I go into production I consistently get an error on one page. ActionView::TemplateError (NoMemoryError: failed to allocate memory: SELECT * FROM ... I cannot track down the exact location of the error, but the production log says it was around:
2006 Jun 10
3
sparse matrix, rnorm, malloc
Hi, I'm sorry for any cross-posting. I've reviewed the archives and could not find an exact answer to my question below. I'm trying to generate very large sparse matrices (< 1% non-zero entries per row). I have a sparse matrix function below which works well until the row/col count exceeds 10,000. This is being run on a machine with 32 GB of memory: sparse_matrix <-
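The poster's sparse_matrix function is truncated above. As a hedged sketch of one way to build such a matrix without ever allocating the dense object, the Matrix package can assemble it from row-index / column-index / value triplets; the helper name and its arguments below are invented for illustration, not taken from the original post.

library(Matrix)

# Build an nrow x ncol sparse matrix with roughly `density` non-zero
# entries, drawing the non-zero values from rnorm(); only the triplets
# are stored, so memory use scales with the number of non-zeros rather
# than with nrow * ncol.
make_sparse <- function(nrow, ncol, density = 0.01) {
  n_nonzero <- ceiling(nrow * ncol * density)
  i <- sample.int(nrow, n_nonzero, replace = TRUE)  # row indices
  j <- sample.int(ncol, n_nonzero, replace = TRUE)  # column indices
  x <- rnorm(n_nonzero)                             # non-zero values
  sparseMatrix(i = i, j = j, x = x, dims = c(nrow, ncol))
}

m <- make_sparse(50000, 50000, density = 0.001)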
2005 Nov 13
1
Memory allocation (PR#8304)
Full_Name: Hans Kestler Version: 2.2.0 OS: 10.4.3 Submission from: (NULL) (84.156.184.101) > sam1.out<-sam(raw1[,2:23],raw1.cl,B=0,rand=124) We're doing 319770 complete permutations Error: cannot allocate vector of size 575586 Kb R(572,0xa000ed68) malloc: *** vm_allocate(size=589402112) failed (error code=3) R(572,0xa000ed68) malloc: *** error: can't allocate region
2008 Jan 10
1
OS X binary: 32 or 64-bit?
Dear R Experts, I am using R.app (the Mac OS X binary) for neuroimage analysis, so I am loading in some large image files. I get the following error in the middle of my script: > source("3dLME.R") Read 1 record Read 1 record Read 1 record Read 1 record Read 1 record Error: cannot allocate vector of size 3.1 Gb R(2081,0xa000d000) malloc: *** vm_allocate(size=3321675776) failed (error
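One quick way to answer the subject line's question from within R itself is to check the pointer size and architecture of the running build; a 32-bit build cannot address a single 3.1 Gb vector no matter how much RAM is installed.

.Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit build
R.version$arch            # e.g. "x86_64" versus "i386"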
2008 Mar 21
1
Memory Problem
Dear all, I am having a memory problem when analyzing a rather large data set with nested factors in R. The model is of the form X~A*B*(C/D/F), with A, B, C, D, and F being the independent variables, some of which are nested. The problem occurs when using aov but also when using glm or lme. In particular I get the following response, Error: cannot allocate vector of size 1.6 Gb R(311,0xa000d000) malloc: ***
2006 Feb 01
2
memory limit in aov
I want to do an unbalanced anova on 272,992 observations with 405 factors including 2-way interactions between 1 of these factors and the other 404. After fitting only 11 factors and their interactions I get error messages like: Error: cannot allocate vector of size 1433066 Kb R(365,0xa000ed68) malloc: *** vm_allocate(size=1467461632) failed (error code=3) R(365,0xa000ed68) malloc: ***
2005 Jul 19
1
mac os x crashes with bioconductor microarray code (PR#8013)
Full_Name: Eric Libby Version: 2.1.1 OS: OS Tiger Submission from: (NULL) (65.93.158.117) I am trying to analyze microarray data of 42 human arrays. I typed in the following instructions: library(affy) Data <-ReadAffy() eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE, pmcorrect.method="pmonly",summary.method="liwong") And I get some
2006 Aug 04
1
incorrect checksum for freed object?
I'm using ferret (0.9.4) in rails, but outside of the "acts_as_ferret" plugin. Whenever I use a QueryFilter (even a very simple one), the server will crash after one, two, or three reloads of a page (same page, same query, same filter). It's very non-deterministic and I can't seem to reproduce it outside of my application environment (I can't get it
2008 Mar 26
2
pseudo R square and/or C statistic in R logistic regression
Dear all, I am now doing logistic regression using R (glm, family=binomial). Besides the standard summary statistics generated by R, I am also interested in some more information concerning the model fitting / prediction etc.; particularly I am interested in the "pseudo R square" and the "C statistic". I searched R-help and could only find very limited information. (Post
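For reference, both quantities can be computed directly from a fitted binomial glm. The sketch below assumes a fitted object named fit; the object name is illustrative, not from the original post.

# McFadden's pseudo R-squared: 1 - logLik(model) / logLik(null model)
null_fit  <- update(fit, . ~ 1)
pseudo_r2 <- 1 - as.numeric(logLik(fit)) / as.numeric(logLik(null_fit))

# C statistic (area under the ROC curve) via the rank / Mann-Whitney
# relation: the proportion of case/non-case pairs ranked concordantly.
p <- fitted(fit)   # fitted probabilities
y <- fit$y         # 0/1 response
c_stat <- (mean(rank(p)[y == 1]) - (sum(y == 1) + 1) / 2) / sum(y == 0)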
2005 Jul 20
2
(no subject)
Hi All, I want to print a square matrix of 7000 x 7000 into a text file. But I got an error after a few hours of computation... -------- > write.table(MyDistMxDF, file = "temp.csv", sep=",", quote=F) *** malloc: vm_allocate(size=8421376) failed (error code=3) *** malloc[2889]: error: Can't allocate region Error: vector memory exhausted (limit reached?) *** malloc:
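One way around the single large allocation that write.table() makes for the whole object is to write the matrix out a block of rows at a time, appending to the file. This is a hedged sketch (the helper name and chunk size are arbitrary), not the solution from the thread.

write_in_chunks <- function(m, file, chunk = 500) {
  for (start in seq(1, nrow(m), by = chunk)) {
    rows <- start:min(start + chunk - 1, nrow(m))
    write.table(m[rows, , drop = FALSE], file = file, sep = ",",
                quote = FALSE, col.names = (start == 1),
                append = (start != 1))   # header only on the first block
  }
}
# e.g. write_in_chunks(MyDistMxDF, "temp.csv")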
2006 Mar 08
1
malloc: vm_allocate(size=381886464) failed (error code=3)
Hi all, I am having a memory allocation problem with my R 2.2.1 for Mac OS. The following is the error message that I get. I do not get this message if I break down the large dataset into sub-datasets. I think breaking up the dataset is not a sustainable solution in the long run. The data that I am analysing is essentially big, and it would be reasonable to do the analysis on the whole dataset
2000 Nov 09
2
SCO Openserver Patch
I downloaded the Openssh-2.3.0p1 file and ran configure only to have it fail, reporting the lack of libz. I found that configure was failing because the test program did not include libtinfo and libm, which are needed to resolve references in libprot. I made changes to the configure script to add those libraries for SCO Openserver and then tried to compile the programs, only to run into another
2000 Jun 19
1
configure problem on UnixWare 7.1.1
Can anyone locate what is wrong with the problem below on UnixWare 7.1.1? The file rand.h was finally found in one of the tests but configure still failed with ... checking for getpagesize... yes checking for OpenSSL directory... configure: error: Could not find working SSLeay / OpenSSL libraries, please install Thanh configure:2302: gcc -o conftest -g -O2 -Wall -I/usr/local/include
2007 Jul 10
2
[LLVMdev] Accounting for stack space
On Tue, 10 Jul 2007, Sandro Magi wrote: >> used. Your choices are to either override malloc/free for both the JIT >> and the program or for neither of them. > > I want to 'intercept' ALL allocations actually, including the stack if > possible, so the above suits me just fine. Ok, just provide your own malloc/free. :) -Chris -- http://nondot.org/sabre/
2007 Mar 21
3
question on suppressing error messages with Rmath library
Dear list, I have been using the Rmath library for quite a while: in the current instance, I am calling dnt (non-central t density function) repeatedly for several million. When the argument is small, I get the warning message: full precision was not achieved in 'pnt' which is nothing unexpected. (The density calls pnt, if you look at the function dnt.) However, to have this happen a
2001 Oct 22
2
configure changes
I finally got around to looking at a bunch of patches to configure.in, some of them from back in March. One from Carson Gaspar <carson at taltos.org> looked promising at first glance but after many hours I just couldn't get it to work. Due to much demand, I have added optional PATH to --with-pcre, --with-zlib, and --with-tcp-wrappers. I have done extensive testing on --with-zlib, and
2004 Jul 09
3
bash as a login shell (was Root users shell == no existant shell /bin/bash)
On 9 Jul 2004 at 13:11, Daniel Brown wrote: > On the other hand, I've run across a sysadmin who always enables his > toor accounts -- and changes its shell to bash. As a result, not only > is there an alternate root account (good in case 'root' is trampled on by > accident or on purpose), but you can get root bash as a login shell while > leaving the real root to its normal
2010 Aug 09
0
[LLVMdev] MmapAllocator
On Sun, Aug 8, 2010 at 9:20 PM, Reid Kleckner <reid.kleckner at gmail.com>wrote: > On Sun, Aug 8, 2010 at 8:20 PM, Jakob Stoklund Olesen <stoklund at 2pi.dk> > wrote: > > > > On Aug 7, 2010, at 7:05 PM, Steven Noonan wrote: > >> I've been doing work on memory reduction in Unladen Swallow, and > >> during testing, LiveRanges seemed to be
2009 Apr 16
2
[LLVMdev] Help me improve two-address code
I have my new port limping enough to compile a very basic function: int foo (int a, int b, int c, int d) { return a + b - c + d; } clang-cc -O2 yields: define i32 @foo(i32 %a, i32 %b, i32 %c, i32 %d) nounwind readnone { entry: %add = add i32 %b, %a ; <i32> [#uses=1] %sub = sub i32 %add, %c ; <i32> [#uses=1] %add4 = add i32 %sub, %d ; <i32>