Hi,

I'm computing a BCa interval using bca.ci from the boot package.
When I try to use this I get an error:

> library(boot)
> boot(logglm.data, boot.fishpower, 2500, coef.vec=coeflm.vec) -> blm8901
> bca.ci(blm8901, index=29)
Error: cannot allocate vector of size 456729 Kb

However, my machine has 2GB of memory, and without R running only 112M
of memory is used.

Is there something I can do to be able to perform this analysis? (I
cannot buy more memory ;-)

Thanks,

EJ

--
Ernesto Jardim <ernesto at ipimar.pt>
Marine Biologist
IPIMAR - National Research Institute for Agriculture and Fisheries
Av. Brasilia, 1400-006 Lisboa, Portugal
Tel: +351 213 027 000
Fax: +351 213 015 948
http://ernesto.freezope.org
Hi,

I'm using SuSE 8.0 and R 1.6.2. The mem.limits are not set, so it should
go to the maximum the machine allows. My doubt is that I have 2GB and R
is complaining about allocating less than 500MB.

Regards,

EJ

On Fri, 2003-01-24 at 22:22, Andrew C. Ward wrote:
> Ernesto,
>
> I can't tell what version of R you're using and for which platform.
> In any case, there are some start-up options relating to memory
> usage, and you will find discussions of these in the relevant
> FAQ. Under Windows, the amount of memory that R uses is set by the
> command-line flag "--max-mem-size".
>
> An alternative is to perform your analysis on just a few random
> subsets of the data and then aggregate the results. I don't know how
> big your data set actually is, so it's hard to provide more
> specific guidance.
>
> Post again if you're still having trouble.
>
> Regards,
>
> Andrew C. Ward
>
> CAPE Centre
> Department of Chemical Engineering
> The University of Queensland
> Brisbane Qld 4072 Australia
> andreww at cheque.uq.edu.au
>
> On Friday, January 24, 2003 10:02 PM, Ernesto Jardim
> [SMTP:ernesto at ipimar.pt] wrote:
> > Hi
> >
> > I'm computing a BCa interval using bca.ci from the boot package.
> >
> > When I try to use this I get an error:
> >
> > > library(boot)
> > > boot(logglm.data, boot.fishpower, 2500, coef.vec=coeflm.vec) -> blm8901
> > > bca.ci(blm8901, index=29)
> > Error: cannot allocate vector of size 456729 Kb
> >
> > However, my machine has 2GB of memory and without R running only 112M
> > of memory is used.
> >
> > Is there something I can do to be able to perform this analysis?
> > (I cannot buy more memory ;-)
> >
> > Thanks
> >
> > EJ
> >
> > --
> > Ernesto Jardim <ernesto at ipimar.pt>
> > Marine Biologist
> > IPIMAR - National Research Institute for Agriculture and Fisheries
> > Av. Brasilia, 1400-006 Lisboa, Portugal
> > Tel: +351 213 027 000
> > Fax: +351 213 015 948
> > http://ernesto.freezope.org
> >
> > ______________________________________________
> > R-help at stat.math.ethz.ch mailing list
> > http://www.stat.math.ethz.ch/mailman/listinfo/r-help

--
Ernesto Jardim <ernesto at ipimar.pt>
Marine Biologist
Research Institute for Agriculture and Fisheries
Lisboa, Portugal
Tel: +351 213 027 000
Fax: +351 213 015 948
On 24 Jan 2003, Ernesto Jardim wrote:

> Hi
>
> I'm computing a BCa interval using bca.ci from the boot package.
>
> When I try to use this I get an error:
>
> > library(boot)
> > boot(logglm.data, boot.fishpower, 2500, coef.vec=coeflm.vec) -> blm8901
> > bca.ci(blm8901, index=29)
> Error: cannot allocate vector of size 456729 Kb
>
> However my machine has 2GB of memory and without R running I only have
> 112M of memory used.

How much memory is it actually using? It is complaining about allocating
an *additional* 450Mb. Look at top / Task Manager / whatever.

> Is there something I can do to be able to perform this analysis?
> (I cannot buy more memory ;-)

Why are you returning so many results (apparently) when you only want
one index? Try returning just one.

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK, Fax: +44 1865 272595
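[Editor's sketch of Ripley's suggestion. The poster's boot.fishpower, logglm.data and coeflm.vec are not shown in the thread, so a toy linear model stands in for them; the point is simply that the statistic function returns one scalar rather than a long coefficient vector, so boot() stores R numbers instead of R large vectors.]

```r
library(boot)

# Toy data standing in for the poster's logglm.data (hypothetical example).
set.seed(1)
d <- data.frame(y = rnorm(100), x = rnorm(100))

# Statistic that returns ONLY the coefficient of interest, so each
# bootstrap replicate stores a single number.
one.coef <- function(data, i) {
  fit <- lm(y ~ x, data = data[i, ])
  coef(fit)[["x"]]          # a single scalar
}

b <- boot(d, one.coef, R = 999)
boot.ci(b, type = "bca")    # BCa interval for just that coefficient
```

(boot.ci() is the user-level interface; bca.ci() is called internally when type = "bca".)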
Hi all,

New to R, and to this list, so this may be an old (and hopefully simple
to solve!) problem.

When running an aov analysis, after fitting two-way interactions, I get
an error message saying "cannot allocate vector of 8515Kb". Often other
programs are then kicked out of memory or the whole computer crashes.

I've tried bumping up memory by adding max-mem-size, max-vsize,
min-vsize, max-nsize, min-nsize, etc. commands (with parameter values up
to 1G) to the start-up line, but that either doesn't work or makes
matters worse (i.e. the analysis crashes earlier). Our computer support
officer here has looked at the problem and he does not believe it has to
do with the actual amount of physical RAM of my machine, but more with
the way R handles memory. I run R under Windows NT, by the way.

Any suggestions as to what I'm doing wrong?

Thanks for any help!

Lex

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Dr Alex R. Kraaijeveld
NERC Centre for Population Biology
Imperial College London, Silwood Park Campus
Ascot, Berkshire SL5 7PY
England, UK
tel: +44-(0)20-75942544
fax: +44-(0)1344-873173
e-mail: a.kraayeveld at imperial.ac.uk
http://www.cpb.bio.ic.ac.uk/staff/kraaijeveld/lkraaijeveld.html
Cloonaughill Celtic Malts
http://www.celticmalts.com/edge.htm
http://www.celticmalts.com/journal.htm
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
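[Editor's note: for reference, the start-up options Lex mentions are command-line flags passed when launching R on Windows, not commands typed inside R. The values below are purely illustrative, and the exact flag set varies between R versions of this era; see the R for Windows FAQ for the flags your build accepts.]

```shell
REM Illustrative Windows NT invocation (example values, not recommendations)
Rgui.exe --max-mem-size=512M --max-vsize=512M
```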
Hi,

Well, it is always a good bet to start with a small subset of your data.
Increase it and take a look at what happens and how much memory it takes
(object.size() should give the size of the object in memory; use OS
tools to check the size of the whole process).

There may also be some OS-related issues. There are some issues with
memory management under Windows; in particular, the memory may become
fragmented. Try the same under UNIX if you have access to it.

And last -- please describe more precisely what exactly you are doing
and what your data look like.

Best wishes,

Ott
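[Editor's sketch of Ott's subset-and-measure approach. The data frame and model below are stand-ins for whatever the real analysis uses; the idea is to grow the subset and watch how object sizes and total memory use scale.]

```r
# Toy data; substitute your own data frame and model formula.
set.seed(1)
d <- data.frame(y = rnorm(10000), x1 = rnorm(10000), x2 = rnorm(10000))

# Fit on increasing subsets and report how big the objects get.
for (n in c(1000, 2500, 5000, 10000)) {
  sub <- d[seq_len(n), ]
  fit <- lm(y ~ x1 + x2, data = sub)
  cat(n, "rows: data", object.size(sub), "bytes; fit",
      object.size(fit), "bytes\n")
}

gc()  # summary of R's current memory use (Ncells/Vcells)
```

If the fitted object grows much faster than the data, that points at the model object (e.g. stored residuals, qr decompositions) rather than the data itself.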