Displaying 20 results from an estimated 800 matches similar to: "Memory Problem"
2006 Jun 06
1
Ampersand Crashes Ruby
I'm using acts_as_ferret and when I call Object.find_by_contents("A &
B"), Ruby dies with the following message:
^Cruby(5014,0xa000cf60) malloc: *** vm_allocate(size=1069056) failed
(error code=3)
ruby(5014,0xa000cf60) malloc: *** error: can't allocate region
ruby(5014,0xa000cf60) malloc: *** set a breakpoint in szone_error to
debug
ruby(5014,0xa000cf60) malloc:
2005 Nov 13
1
Memory allocation (PR#8304)
Full_Name: Hans Kestler
Version: 2.2.0
OS: 10.4.3
Submission from: (NULL) (84.156.184.101)
> sam1.out<-sam(raw1[,2:23],raw1.cl,B=0,rand=124)
We're doing 319770 complete permutations
Error: cannot allocate vector of size 575586 Kb
R(572,0xa000ed68) malloc: *** vm_allocate(size=589402112) failed (error code=3)
R(572,0xa000ed68) malloc: *** error: can't allocate region
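The allocation fails while sam() (from siggenes) is being set up for all 319,770 complete permutations triggered by B = 0. A hedged workaround sketch, not from the thread, reusing the same raw1/raw1.cl objects: request a bounded number of random permutations instead.
library(siggenes)
# B = 1000 random permutations instead of the complete set implied by B = 0;
# rand fixes the seed as in the original call.
sam1.out <- sam(raw1[, 2:23], raw1.cl, B = 1000, rand = 124)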
2008 Jan 10
1
OS X binary: 32 or 64-bit?
Dear R Experts,
I am using R.app (the Mac OS X binary) for neuroimage analysis, so I
am loading in some large image files. I get the following error in the
middle of my script:
> source("3dLME.R")
Read 1 record
Read 1 record
Read 1 record
Read 1 record
Read 1 record
Error: cannot allocate vector of size 3.1 Gb
R(2081,0xa000d000) malloc: *** vm_allocate(size=3321675776) failed
(error
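Whether the failing R.app session is a 32-bit or a 64-bit build can be checked from inside R; a quick sketch (not part of the original script):
# Pointer size is 4 bytes in a 32-bit build and 8 bytes in a 64-bit build;
# a 32-bit process generally cannot find a contiguous 3.1 Gb block.
.Machine$sizeof.pointer   # 4 = 32-bit, 8 = 64-bit
R.version$arch            # e.g. "i386" vs "x86_64"
sessionInfo()             # also prints the full platform string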
2006 Jun 10
3
sparse matrix, rnorm, malloc
Hi,
I'm sorry for any cross-posting. I've reviewed the archives and could
not find an exact answer to my question below.
I'm trying to generate very large sparse matrices (< 1% non-zero
entries per row). I have a sparse matrix function below which works
well until the row/col count exceeds 10,000. This is being run on a
machine with 32 GB of memory:
sparse_matrix <-
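For matrices this large, the Matrix package can build the sparse representation directly instead of filling a dense matrix first. A sketch of that approach (rsparsematrix() and the sizes below are illustrative, not the poster's function):
library(Matrix)
# ~1% non-zero entries, values drawn from rnorm; only the non-zeros are stored,
# so no dense 50,000 x 50,000 block (~20 GB of doubles) is ever allocated.
m <- rsparsematrix(50000, 50000, density = 0.01, rand.x = rnorm)
print(object.size(m), units = "MB")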
2006 Apr 18
1
NoMemoryError
I am using the Openbase adapter and have had a similar glitch here
and there, but after I go into production I consistently get an error
on one page.
ActionView::TemplateError (NoMemoryError: failed to allocate memory:
SELECT * FROM ...
I cannot track down the exact location of the error, but the
production log says it was around:
2006 Feb 01
2
memory limit in aov
I want to do an unbalanced anova on 272,992 observations with 405
factors including 2-way interactions between 1 of these factors and
the other 404. After fitting only 11 factors and their interactions I
get error messages like:
Error: cannot allocate vector of size 1433066 Kb
R(365,0xa000ed68) malloc: *** vm_allocate(size=1467461632) failed
(error code=3)
R(365,0xa000ed68) malloc: ***
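Back-of-the-envelope arithmetic (inferred from the error message, not stated in the post) shows why the fit runs out of memory: aov builds a dense double-precision model matrix at 8 bytes per cell.
n <- 272992                 # observations
1433066 * 1024 / 8 / n      # ~672 columns implied by the failed ~1.4 GB request
n * 5000 * 8 / 1024^3       # ~10 GB if the dummy-coded design grew to ~5,000 columns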
2008 Apr 15
3
R memory issue for writing out the file
Hello, all,
First thanks in advance for helping me.
I am now handling a data frame with 11,095,400 rows and 4 columns. It
seems to work perfectly in R on my Mac (Mac Pro, Intel chip with 4 GB RAM) until
I try to write this file out using the command:
write.table(all, file="~/Desktop/alex.lgen", sep=" ", row.names=F, na="0", quote=F, col.names=F)
I got the error
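Because write.table needs transient character copies of the data while formatting it, one workaround is to write the frame in row blocks, appending to the same file. A sketch, assuming 'all' is the 11,095,400 x 4 data frame and the separator is a single space:
chunk <- 1e6                                   # rows per block
for (s in seq(1, nrow(all), by = chunk)) {
  rows <- s:min(s + chunk - 1, nrow(all))
  write.table(all[rows, ], file = "~/Desktop/alex.lgen", sep = " ",
              row.names = FALSE, na = "0", quote = FALSE, col.names = FALSE,
              append = (s > 1))                # first block starts the file
}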
2005 Jul 19
1
mac os x crashes with bioconductor microarray code (PR#8013)
Full_Name: Eric Libby
Version: 2.1.1
OS: OS Tiger
Submission from: (NULL) (65.93.158.117)
I am trying to analyze microarray data of 42 human arrays. I typed in the
following instructions:
library(affy)
Data <-ReadAffy()
eset <- expresso(Data, normalize.method="invariantset", bg.correct=FALSE,
pmcorrect.method="pmonly",summary.method="liwong")
And I get some
2006 Aug 04
1
incorrect checksum for freed object?
I'm using ferret (0.9.4) in rails, but outside of the "acts_as_ferret"
plugin. Whenever I use a QueryFilter (even a very simple one), the server
will crash after one, two, or three reloads of a page (same page, same
query, same filter). It's very non-deterministic and I can't seem to
reproduce it outside of my application environment (I can't get it
2008 Mar 25
2
R 64 on Intel Mac check problem
Dear all, I have been following the instructions on
http://cran.stat.sfu.ca/ with the intention of installing R with 64-bit
support on an 8-core Intel Mac with 6 GB of memory. I am doing this so I
can run an analysis that requires 1.6 GB of memory allocation and would
not run on the 32-bit version of R, as I was advised and have experienced.
I have installed gcc 4.2 and gfortran 4.2 from the available sources
2006 Mar 08
1
malloc: vm_allocate(size=381886464) failed (error code=3)
Hi all,
I am having a memory allocation problem with R 2.2.1 for Mac OS. The
following is the error message that I get. I do not get this message if I
break the large dataset down into sub-datasets. I think breaking up the
dataset is not a sustainable solution in the long run. The data that I am
analysing is essentially big, and it would be reasonable to do the analysis
on the whole dataset
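Short of splitting the file by hand, the data can be read in blocks from a single connection and each block processed before the next is loaded, so the full dataset is never in memory at once. A sketch with a made-up file name and block size:
con <- file("bigdata.txt", open = "r")
repeat {
  block <- tryCatch(read.table(con, nrows = 100000, header = FALSE),
                    error = function(e) NULL)  # NULL once the file is exhausted
  if (is.null(block)) break
  ## ... summarise or model 'block' here, keeping only the results ...
}
close(con)
Supplying colClasses to read.table also cuts the memory needed for each block.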
2005 Jul 20
2
(no subject)
Hi All,
I want to write a 7000 x 7000 square matrix to a text file, but I
got an error after a few hours of computation...
--------
> write.table(MyDistMxDF, file = "temp.csv", sep=",", quote=F)
*** malloc: vm_allocate(size=8421376) failed (error code=3)
*** malloc[2889]: error: Can't allocate region
Error: vector memory exhausted (limit reached?)
*** malloc:
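Rough arithmetic (assuming double-precision entries, which the post does not state): the matrix alone is sizeable before write.table adds its transient character copies on top.
7000 * 7000 * 8 / 1024^2    # ~374 MB for the numeric 7000 x 7000 matrix itself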
2006 Nov 08
5
Mac x86: Diablo II Copy Protection
Hello,
I've installed Diablo 2 LoD successfully under X11 and Wine.
But now when I try to start LoD it says I need the CD to play.
The CD is inserted, the path in the registry is exact, and it mounted successfully.
In the console I get the following error:
err:aspi:ASPI_GetNumControllers Could not open HKLM\L"HARDWARE\\DEVICEMAP\\Scsi"
wineserver(4712) malloc: *** vm_allocate(size=4286775296)
2011 Jan 11
1
Bonding performance question
I have a Dell server with four bonded, gigabit interfaces. Bonding mode is
802.3ad, xmit_hash_policy=layer3+4. When testing this setup with iperf,
I never get more than a total of about 3Gbps throughput. Is there anything
to tweak to get better throughput? Or am I running into other limits (e.g., I
was reading about TCP retransmit limits for mode 0)?
The iperf test was run with iperf -s on the
2007 Jul 10
0
[LLVMdev] Accounting for stack space
On Jul 10, 2007, at 15:39, Chris Lattner wrote:
> On Tue, 10 Jul 2007, Sandro Magi wrote:
>
>>> used. Your choices are to either override malloc/free for both
>>> the JIT and the program or for neither of them.
>>
>> I want to 'intercept' ALL allocations actually, including the
>> stack if possible, so the above suits me just fine.
>
>
2004 Feb 18
1
R would not quit (bug?) (PR#6600)
Hello.
I am not sure if this is a bug.
I loaded a very large matrix into R, around 600,000 x 30 integers. My
machine has 2 GB of RAM.
After trying to write.table a modification of this matrix, R generated
some sort of malloc error.
Then,
Error: vector memory exhausted (limit reached?)
> q()
Save workspace image? [y/n/c]: n
*** malloc: vm_allocate(size=8421376) failed (error code=3)
***
2005 May 09
1
bootstrap and lme4
Hi,
I am trying to get bootstrap confidence intervals on variance
components and related statistics. To calculate the variance components
I use the package lme4.
> off.fun <- function(data, i){
d <- data[i,]
lme1<- lmer(y ~ trt + (trt-1|group), d)
VarCorr(lme1)@reSumry$group[2,1] #just as an example
}
> off.boot <- boot(data=data.sim, statistic=off.fun, R=100)
If
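The @reSumry slot belongs to the lme4 version current at the time. In later lme4 releases the variance components are reached through as.data.frame(VarCorr(...)), and the package ships bootMer() for exactly this kind of resampling; a hedged sketch of the same idea against that newer interface:
library(lme4)
# Parametric bootstrap of the variance components with the newer accessors;
# y, trt, group and data.sim are the objects from the post.
vc.fun <- function(fit) as.data.frame(VarCorr(fit))$vcov
lme1 <- lmer(y ~ trt + (trt - 1 | group), data = data.sim)
vc.boot <- bootMer(lme1, FUN = vc.fun, nsim = 100)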
2005 Oct 27
2
Question on Quad Opteron HP DL 585
Hi,
I have a HP DL 585 server with Quad Opteron processors and 32 Gbytes of
RAM.
I would like to know whether CentOS 3.5 x86_64 will recognize all 4 CPUs
and 32 Gbytes of RAM.
I will also be running EDA tools like PrimeTime and Magma on this server.
Does CentOS have any issues with these tools?
Your help on this question is appreciated.
Thanks
Siva.
2005 Feb 28
1
memory problem with mac os X
Dear list,
I am using R 2.0.1 on a dual-processor 2.5 GHz G5 with 2 GB of RAM (Mac OS X
10.3.8).
I'm trying to calculate an object of type "dist". I am getting the
following memory error:
*** malloc: vm_allocate(size=1295929344) failed (error code=3)
*** malloc[25960]: error: Can't allocate region
Error: cannot allocate vector of size 1265554 Kb
When I run top in the Terminal, I
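Hedged arithmetic relating the failed allocation to the data size: a "dist" object stores n*(n-1)/2 double-precision values, so the ~1.24 GB request is consistent with roughly 18,000 observations, which leaves little room in 2 GB of RAM once R itself and the original data are counted.
n <- 18000                      # inferred from the error, not stated in the post
n * (n - 1) / 2 * 8 / 1024^2    # ~1236 MB, matching the ~1.24 GB vm_allocate request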
2010 Aug 09
0
[LLVMdev] MmapAllocator
On Sun, Aug 8, 2010 at 9:20 PM, Reid Kleckner <reid.kleckner at gmail.com> wrote:
> On Sun, Aug 8, 2010 at 8:20 PM, Jakob Stoklund Olesen <stoklund at 2pi.dk>
> wrote:
> >
> > On Aug 7, 2010, at 7:05 PM, Steven Noonan wrote:
> >> I've been doing work on memory reduction in Unladen Swallow, and
> >> during testing, LiveRanges seemed to be