similar to: sparse matrix, rnorm, malloc

Displaying 20 results from an estimated 3000 matches similar to: "sparse matrix, rnorm, malloc"

2006 Jun 06
1
Ampersand Crashes Ruby
I'm using acts_as_ferret and when I call Object.find_by_contents("A & B"), Ruby dies with the following message: ^Cruby(5014,0xa000cf60) malloc: *** vm_allocate(size=1069056) failed (error code=3) ruby(5014,0xa000cf60) malloc: *** error: can't allocate region ruby(5014,0xa000cf60) malloc: *** set a breakpoint in szone_error to debug ruby(5014,0xa000cf60) malloc:
2006 Apr 18
1
NoMemoryError
I am using the Openbase adapter and have had a similar glitch here and there, but after I go into production I consistently get an error on one page. ActionView::TemplateError (NoMemoryError: failed to allocate memory: SELECT * FROM ... I cannot track down the exact location of the error, but the production log says it was around:
2005 Nov 13
1
Memory allocation (PR#8304)
Full_Name: Hans Kestler Version: 2.2.0 OS: 10.4.3 Submission from: (NULL) (84.156.184.101) > sam1.out<-sam(raw1[,2:23],raw1.cl,B=0,rand=124) We're doing 319770 complete permutations Error: cannot allocate vector of size 575586 Kb R(572,0xa000ed68) malloc: *** vm_allocate(size=589402112) failed (error code=3) R(572,0xa000ed68) malloc: *** error: can't allocate region
2008 Jan 10
1
OS X binary: 32 or 64-bit?
Dear R Experts, I am using R.app (the Mac OS X binary) for neuroimage analysis, so I am loading in some large image files. I get the following error in the middle of my script: > source("3dLME.R") Read 1 record Read 1 record Read 1 record Read 1 record Read 1 record Error: cannot allocate vector of size 3.1 Gb R(2081,0xa000d000) malloc: *** vm_allocate(size=3321675776) failed (error
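A quick way to tell whether the running R binary is 32- or 64-bit, from inside the session itself; a minimal sketch, with the comment values being the usual ones rather than output from this poster's machine:

    ## Pointer size distinguishes 32-bit from 64-bit builds of R.
    .Machine$sizeof.pointer        # 4 on a 32-bit build, 8 on a 64-bit build
    8 * .Machine$sizeof.pointer    # the same thing expressed in bits
    R.version$arch                 # architecture string, e.g. "i386" vs "x86_64"

A 32-bit process cannot hand out a single 3.1 Gb vector no matter how much RAM is installed, which is consistent with the vm_allocate failure above.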
2008 Mar 21
1
Memory Problem
Dear all, I am having a memory problem when analyzing a rather large data set with nested factors in R. The model is of the form X~A*B*(C/D/F) A,B,C,D,F being the independent variables some of which are nested. The problem occurs when using aov but also when using glm or lme. In particular I get the following response, Error: cannot allocate vector of size 1.6 Gb R(311,0xa000d000) malloc: ***
2008 Apr 15
3
R memory issue for writing out the file
Hello, all, First, thanks in advance for helping me. I am now handling a data frame with dimensions 11095400 rows and 4 columns. It seemed to work perfectly in my Mac R (Mac Pro, Intel chip with 4G RAM) until I tried to write this file out using the command: write.table(all,file="~/Desktop/alex.lgen",sep=" ",row.names=F,na="0",quote=F,col.names=F) I got the error
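One workaround when write.table runs out of memory on a data frame this size is to write it in row blocks with append = TRUE, so only one block has to be converted to text at a time. A sketch only, reusing the object name 'all' and the options from the post; the block size is arbitrary:

    ## Write the 11,095,400 x 4 data frame in row blocks to limit peak memory use.
    out   <- "~/Desktop/alex.lgen"
    block <- 1e6                                  # rows per write; tune to taste
    n     <- nrow(all)
    for (s in seq(1, n, by = block)) {
      e <- min(s + block - 1, n)
      write.table(all[s:e, ], file = out, sep = " ",
                  row.names = FALSE, col.names = FALSE,
                  na = "0", quote = FALSE,
                  append = (s > 1))               # overwrite on the first block, append after
    }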
2006 Feb 01
2
memory limit in aov
I want to do an unbalanced anova on 272,992 observations with 405 factors including 2-way interactions between 1 of these factors and the other 404. After fitting only 11 factors and their interactions I get error messages like: Error: cannot allocate vector of size 1433066 Kb R(365,0xa000ed68) malloc: *** vm_allocate(size=1467461632) failed (error code=3) R(365,0xa000ed68) malloc: ***
2005 Jul 19
1
mac os x crashes with bioconductor microarray code (PR#8013)
Full_Name: Eric Libby Version: 2.1.1 OS: OS Tiger Submission from: (NULL) (65.93.158.117) I am trying to analyze microarray data of 42 human arrays. I typed in the following instructions: library(affy) Data <-ReadAffy() eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE, pmcorrect.method="pmonly",summary.method="liwong") And I get some
2007 Nov 15
1
Problem with rsync recent file logic ?
Hello, I have 2 servers I'm synchronizing using rsync, and I have a situation where I: 1. rsync from rnd-dev2 to rnd-dev1 2. change the rsynced file on rnd-dev1 3. rsync from rnd-dev2 to rnd-dev1 again 4. the file gets overwritten on rnd-dev1 even though it has a newer change time than the file on rnd-dev2. Here is the bug(?) reproduction: [root@rnd-dev1 test_rsync]# rsync --version rsync version
2006 Aug 04
1
incorrect checksum for freed object?
I'm using ferret (0.9.4) in rails, but outside of the "acts_as_ferret" plugin. Whenever I use a QueryFilter (even a very simple one), the server will crash after one, two, or three reloads of a page (same page, same query, same filter). It's very non-deterministic and I can't seem to reproduce it outside of my application environment (I can't get it
2012 Mar 30
4
[PATCH] virtio_blk: Drop unused request tracking list
Benchmark shows small performance improvement on fusion io device. Before: seq-read : io=1,024MB, bw=19,982KB/s, iops=39,964, runt= 52475msec seq-write: io=1,024MB, bw=20,321KB/s, iops=40,641, runt= 51601msec rnd-read : io=1,024MB, bw=15,404KB/s, iops=30,808, runt= 68070msec rnd-write: io=1,024MB, bw=14,776KB/s, iops=29,552, runt= 70963msec After: seq-read : io=1,024MB, bw=20,343KB/s,
2007 Nov 01
1
Problem with compiling 64bit R(2.5.1) under HP-UX(ia64)
Hi there, We are trying to compile a 64bit version of R (2.5.1) on HP-UX (B.11.23 U ia64), but are running into some problems. This is our configure step: ../configure --prefix=/rnd/homes/lfan/R251 --enable-R-shlib CC="cc" CFLAGS="+z +DD64" CXX="aCC" CXXFLAGS="-b -lxnet +z +DD64" FC="f90" FCFLAGS="+DD64" F77="f90"
2003 Dec 15
2
help in lme
To anyone who can help: I have two stupid questions and one fairly intelligent question. Stupid question (1): is there an R function to calculate the factorial of a number? That is, is there a function g(.) such that g(3) = 6, g(4) = 24, g(6) = 720, etc.? Stupid question (2): how do you extract the estimated covariance matrix of the random effects from an lme object? Intelligent question
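Both "stupid" questions have short answers in base R and nlme; a quick sketch (the lme fit below uses the Orthodont example data purely for illustration, not the poster's model):

    ## (1) Base R already has factorial()
    factorial(3)    # 6
    factorial(4)    # 24
    factorial(6)    # 720

    ## (2) Random-effects covariance matrix from an lme fit
    library(nlme)
    fit <- lme(distance ~ age, random = ~ age | Subject, data = Orthodont)
    getVarCov(fit)  # estimated covariance matrix of the random effects
    VarCorr(fit)    # the same information as variances / correlations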
2005 Jul 20
2
(no subject)
Hi All, I want to write a 7000 x 7000 square matrix into a text file, but I got an error after a few hours of computation... -------- > write.table(MyDistMxDF, file = "temp.csv", sep=",", quote=F) *** malloc: vm_allocate(size=8421376) failed (error code=3) *** malloc[2889]: error: Can't allocate region Error: vector memory exhausted (limit reached?) *** malloc:
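If the object can be coerced to a numeric matrix, MASS::write.matrix is a lighter-weight alternative that can stream the output in row blocks instead of formatting all 49 million cells at once. A sketch only, reusing the object name from the post:

    ## Stream the 7000 x 7000 matrix to disk in blocks of 500 rows.
    library(MASS)
    write.matrix(as.matrix(MyDistMxDF), file = "temp.csv",
                 sep = ",", blocksize = 500)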
2006 Mar 08
1
malloc: vm_allocate(size=381886464) failed (error code=3)
Hi all, I am having a memory allocation problem with R 2.2.1 for Mac OS. The following is the error message that I get. I do not get this message if I break the large dataset down into sub-datasets, but I think breaking up the dataset is not a sustainable solution in the long run. The data that I am analysing is inherently large, and it would be reasonable to do the analysis on the whole dataset
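Breaking the data up does not have to mean maintaining separate sub-datasets on disk; reading fixed-size chunks from a single open connection achieves the same thing. A rough sketch with a hypothetical file name and chunk size:

    ## Process a large delimited file in chunks so the full table never
    ## has to sit in memory at once.
    con <- file("bigdata.txt", open = "r")
    chunk_size <- 10000
    repeat {
      chunk <- tryCatch(read.table(con, nrows = chunk_size, header = FALSE),
                        error = function(e) NULL)   # NULL once the file is exhausted
      if (is.null(chunk)) break
      ## ... summarise / accumulate results for this chunk here ...
      if (nrow(chunk) < chunk_size) break
    }
    close(con)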
2010 Jan 18
1
A question about build R-2.10.0 on HP-UX ia64 server.
Hi R users, I want to build R-2.10.0 on HP-UX, but I got the following error message: ld: Unsatisfied symbol "zgemm" in file CHOLMOD.a[cholmod_l_super_numeric.o] ld: Unsatisfied symbol "zgemv" in file CHOLMOD.a[cholmod_l_super_solve.o] ld: Unsatisfied symbol "zherk" in file CHOLMOD.a[cholmod_l_super_numeric.o] ld: Unsatisfied symbol "ztrsm" in file
2010 Feb 22
2
Siegel-Tukey test for equal variability (code)
Hi, I recently ran into the problem that I needed a Siegel-Tukey test for equal variability based on ranks. Maybe there is a package that has it implemented, but I could not find it, so I programmed an R function to do it. The Siegel-Tukey test requires recoding the ranks so that they express variability rather than ascending order. This is essentially what the code further below does. After the
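For reference, the recoding itself is only a few lines: rank 1 goes to the smallest value, ranks 2 and 3 to the two largest, ranks 4 and 5 to the next two smallest, and so on, alternating ends. A sketch assuming no ties; this is not the poster's code, and the sample values at the end are made up:

    ## Siegel-Tukey rank recoding: extreme values get low ranks from
    ## alternating ends, so the ranks measure spread rather than location.
    siegel_tukey_ranks <- function(x) {
      n   <- length(x)
      ord <- order(x)                  # indices of x from smallest to largest
      ranks <- numeric(n)
      lo <- 1; hi <- n; r <- 1; from_low <- TRUE; take <- 1
      while (lo <= hi) {
        k <- min(take, hi - lo + 1)
        if (from_low) {
          idx <- ord[lo:(lo + k - 1)]; lo <- lo + k
        } else {
          idx <- ord[hi:(hi - k + 1)]; hi <- hi - k
        }
        ranks[idx] <- r:(r + k - 1)
        r <- r + k; from_low <- !from_low; take <- 2
      }
      ranks
    }

    ## The test itself is then a Wilcoxon rank-sum test on the recoded ranks.
    x  <- c(4, 16, 5, 3); y <- c(10, 9, 11, 8)
    rk <- siegel_tukey_ranks(c(x, y))
    g  <- rep(1:2, c(length(x), length(y)))
    wilcox.test(rk[g == 1], rk[g == 2])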
2009 Apr 24
2
Error building package: LaTeX error when creating PDF version
Hi all, I am trying to build an R package, which I have successfully done many times before, but have hit an error I cannot trace. I hope someone can help me. Here's some edited output (full output below if it is useful): pdunn2@PDunnUbuntu:~/DSdata$ R CMD build GLMsData * checking for file 'GLMsData/DESCRIPTION' ... OK * preparing 'GLMsData': * checking DESCRIPTION
2010 Jan 11
1
Help with Order
Dear List, As a fairly new R programmer I seem to have run into a strange problem - probably due to my inexperience with R. After reading and merging successive files into a single data frame, I find that order does not sort the data as expected. I have multiple references in each file, but each file refers to measurement data obtained at a different time. Here's the code: library(reshape) #
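Two usual suspects when order() seems to misbehave after a merge are (a) forgetting that order() returns row indices that must go inside df[ , ], and (b) a numeric column that has silently become character or factor, so it sorts lexicographically. A small sketch with made-up column names, not the poster's data:

    ## order() gives a permutation of row indices, used inside [ , ]
    df <- data.frame(id   = c("a", "b", "a", "b"),
                     time = c("10", "2", "3", "1"),   # character, not numeric!
                     stringsAsFactors = FALSE)

    df[order(df$time), ]          # lexicographic: "1" < "10" < "2" < "3"
    df$time <- as.numeric(df$time)
    df[order(df$id, df$time), ]   # by id, then numeric time, as intended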