Hi dear R-listers,

I'm trying to fit a 3-level model using lme in R. My sample size is about 2965, with 3 factors: year (5 levels), ssize (4 levels), condition (2 levels).

When I issue the following command:

> lme(var ~ year*ssize*condition, random = ~ ssize + condition | subject,
>     data = smp, method = "ML")

I got the following error:

Error in logLik.lmeStructInt(lmeSt, lmePars) :
        Calloc could not allocate (65230 of 8) memory
In addition: Warning message:
Reached total allocation of 120Mb: see help(memory.size)

I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz processor. My version of R is 1.7.1.

Thanks in advance,

Rodrigo Abt.
Department of Economic and Tributary Studies,
SII, Chile.
The error says you don't have enough memory on your computer. Unfortunately, the only solution may be to buy more.

-roger

Rodrigo Abt wrote:
> [...]
Have you done what the message said?

On Mon, 10 Nov 2003, Rodrigo Abt wrote:
> [...]

You probably need more memory, but you could try following the advice in the help page pointed to. If you increase the memory allocation, R will continue to run, albeit slowly.

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK, Fax: +44 1865 272595
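The advice above can be sketched in a few lines. This assumes the Windows-only helpers documented under help(memory.size) in R 1.x; memory.limit() may not exist in every 1.x release, in which case the cap has to be set at startup instead (see the command-line flags discussed later in the thread). The 512 figure is illustrative, not a recommendation.

```r
## Inspect the current memory situation (Windows-only helpers):
memory.size()             # Mb currently in use by this R session
memory.size(max = TRUE)   # largest amount obtained from Windows so far
memory.limit()            # the current cap (120 Mb in the error above)

## Raise the cap for the rest of the session. With only 128 Mb of
## physical RAM this spills into virtual memory, so the lme fit can
## finish, albeit slowly:
memory.limit(size = 512)
```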
On 10 Nov 2003 at 13:01, Rodrigo Abt wrote:

See ?Memory for how you can get R to use virtual memory.

Kjetil Halvorsen

> [...]
How much processing takes place before you get to the lme call? Maybe R has just used up the memory on something else. I think there is a fair amount of memory leak, as I get similar problems with my program. I use R 1.8.0.

My program goes as follows:

1. Use RODBC to get a data.frame containing the assays to analyze (17 assays are found).
2. Define an AnalyzeAssay(assay, suffix) function to do the following:
   a) Use RODBC to get data.
   b) Store the dataset "limsdata" in the workspace using the <<- operator, to avoid the following error in qqnorm.lme when I call it with a grouping formula like ~ resid(.) | ORDCURV:
      Error in eval(expr, envir, enclos) : Object "limsdata" not found
   c) Call lme to analyze the data.
   d) Produce some diagnostic plots. Record them by setting record=TRUE on the trellis.device.
   e) Save the plots to a win.metafile device using replayPlot(...).
   f) Save text output to a file using sink(...).
3. Call the function for each assay using the code:

# Analyze each assay
for(i in 1:length(assays[,1]))
{
    writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
        assays$PROFNO[i], "...", sep=""))
    flush.console()
    AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])

    # Clean up memory
    rm(limsdata)
    gc()
}

As you can see, I try to remove the dataset stored in the workspace and then call gc() to clean up memory as I go. Nevertheless, when I come to assay 11 out of 17, it stops with a memory allocation error. I have to quit R and start again with assay 11; then it stops again at assay 15 and finally 17. The last assays have much more data than the first ones, but all assays can be completed as long as I keep restarting...

Maybe restarting the job can help you get it done?

Cheers,
Jesper

-----Original Message-----
From: Rodrigo Abt [mailto:rodrigo.abt at sii.cl]
Sent: Monday, November 10, 2003 11:02 AM
To: r-help at stat.math.ethz.ch
Subject: [R] Memory issues..

> [...]
I am using Windows 2000.

Kind regards,
Jesper Frickmann
Statistician, Quality Control
Novozymes North America Inc.
Tel. +1 919 494 3266
Fax +1 919 494 3460

-----Original Message-----
From: Thomas W Blackwell [mailto:tblackw at umich.edu]
Sent: Wednesday, November 12, 2003 10:43 AM
To: JFRI (Jesper Frickman)
Cc: rodrigo.abt at sii.cl; jmacdon at umich.edu; r-help at stat.math.ethz.ch
Subject: RE: [R] Memory issues..

Jesper - (off-list)

Jim MacDonald reports seeing different memory-management behavior between Windows and Linux operating systems on the same, dual-boot machine. Unfortunately, this is happening at the operating-system level, so the R code cannot do anything about it. I have cc'ed Jim on this email, hoping that he will give more details to the entire list. What operating systems (and versions of R) do you think Rodrigo and Jesper are using?

Specifically for Jesper's AnalyzeAssay() function: there is some manipulation you can do using formula() or as.formula() that will assign a local object as the environment in which to find values for the terms in a formula. (I've never done this, so I can't give you an example of working code, only references to the help pages for "formula" and "environment". It's often very instructive to literally type in the sequence of statements given as examples at the bottom of each help page.) I think this will allow you to avoid assigning to the global workspace.

Are you sure that the call to rm() below is actually removing the copy of limsdata that's in .GlobalEnv, rather than a local copy? I would expect you to have to specify where=1 in order to get the behavior you want.

- tom blackwell - u michigan medical school - ann arbor -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:
> [...]
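Blackwell's formula/environment suggestion can be sketched as follows. This is a minimal, untested sketch: 'limsdata' and 'ORDCURV' are the names from Jesper's description, the sqlQuery() call stands in for whatever RODBC query he actually runs, and whether qqnorm.lme honours the formula's environment in R 1.8.0 is exactly the open question here.

```r
## Sketch: keep 'limsdata' local to the function and point the grouping
## formula's environment at the local frame, so that later evaluation
## (e.g. by qqnorm on the lme fit) can find 'limsdata' without any
## <<- assignment into .GlobalEnv.
AnalyzeAssay <- function(assay, suffix) {
  limsdata <- sqlQuery(channel, query.for(assay, suffix))  # query.for() is hypothetical
  fit <- lme(resp ~ conc, data = limsdata, random = ~ 1 | ORDCURV)

  grp <- ~ resid(.) | ORDCURV
  environment(grp) <- environment()  # the local frame, not .GlobalEnv
  qqnorm(fit, grp)
}
```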
I have just tried listing limsdata from the workspace, and it is indeed gone from .GlobalEnv. I also tried passing the environment to the as.formula function, but it still doesn't work.

Kind regards,
Jesper Frickmann
Statistician, Quality Control
Novozymes North America Inc.
Tel. +1 919 494 3266
Fax +1 919 494 3460

-----Original Message-----
From: Thomas W Blackwell [mailto:tblackw at umich.edu]
Sent: Wednesday, November 12, 2003 10:43 AM
To: JFRI (Jesper Frickman)
Cc: rodrigo.abt at sii.cl; jmacdon at umich.edu; r-help at stat.math.ethz.ch
Subject: RE: [R] Memory issues..

> [...]
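For the record, Blackwell's "where=1" is S-PLUS idiom; in R, rm() takes 'pos' or 'envir' arguments instead. A sketch of removing the workspace copy explicitly, so there is no ambiguity about which 'limsdata' goes away:

```r
## Remove the copy of 'limsdata' that lives in the user workspace
## (.GlobalEnv), not any local copy, then trigger a garbage collection:
rm(list = "limsdata", envir = globalenv())   # or: rm(limsdata, pos = 1)
gc()                                         # reports and frees unused memory
```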
I tried first to increase --min-vsize to 2G (which I assume means as much of the 512M RAM available on my system as possible). The idea was to allocate all the heap memory in one huge chunk to avoid fragmentation. It actually brought the number of assays completed up from 11 to 13 before it stopped with the usual error. Then I increased --max-memory-size to 2G, and when I came in this morning it was still running. However, it would probably take days instead of hours to complete the last couple of assays! So it is easier to restart a couple of times...

Do you think that running R on Linux would fix the problem? I use Linux on my private home PC, and I might get permission to try it out on the company network... if I have a good reason to do so!

Cheers,
Jesper

-----Original Message-----
From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk]
Sent: Wednesday, November 12, 2003 10:55 AM
To: JFRI (Jesper Frickman)
Cc: r-help at stat.math.ethz.ch
Subject: RE: [R] Memory issues..

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:
> How much processing takes place before you get to the lme call? Maybe
> R has just used up the memory on something else. I think there is a
> fair amount of memory leak, as I get similar problems with my program.
> I use

Windows, right? I don't think this is a memory leak, but rather fragmentation. Hopefully the memory management in R-devel will ease this, and you might like to compile that up and try it. On R 1.8.0 on Windows you have to be able to find a block of contiguous memory of the needed size, so fragmentation can kill you. Try increasing --max-memory-size unless you are near 2Gb.

> [...]

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK, Fax: +44 1865 272595
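The flags discussed above are passed on the command line when starting R on Windows. A sketch, with illustrative values; note the thread writes "--max-memory-size", but the Windows flag as documented in ?Memory for R 1.x was spelled --max-mem-size (as Rodrigo uses later in the thread).

```shell
# Reserve a large contiguous vector heap up front (to reduce
# fragmentation) and raise the overall allocation cap when launching
# R 1.8.0 on Windows. Values are illustrative:
Rgui.exe --min-vsize=256M --max-mem-size=1024M
```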
I just tried out the 1.8.1 beta build, and it works! It ran through all 17 assays without any problems on Windows 2000. Thanks to the R development team, they did a great job!

Kind regards,
Jesper Frickmann
Statistician, Quality Control
Novozymes North America Inc.
Tel. +1 919 494 3266
Fax +1 919 494 3460

-----Original Message-----
From: James MacDonald [mailto:jmacdon at med.umich.edu]
Sent: Wednesday, November 12, 2003 1:09 PM
To: JFRI (Jesper Frickman); rodrigo.abt at sii.cl; tblackw at umich.edu
Cc: jmacdon at umich.edu
Subject: RE: [R] Memory issues..

There was a discussion about memory allocation on the R-devel list this summer, and apparently somebody has done something about it in R-1.8.1 (according to BDR's earlier post). If you can compile yourself on Windows, you could check it out yourself.

Original post: http://maths.newcastle.edu.au/~rking/R/devel/03b/0432.html
BDR's reply: http://maths.newcastle.edu.au/~rking/R/devel/03b/0433.html
BDR's recent comment: "Hopefully the memory management in R-devel will ease this, and you might like to compile that up and try it."

HTH,
Jim

James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623

>>> Rodrigo Abt <rodrigo.abt at sii.cl> 11/12/03 12:08PM >>>
I started R with --max-mem-size=300M and it "seems" to work better (at least it doesn't hang up my machine), but I don't have any results yet.

P.S.: Are there any differences in memory management from 1.7.x to 1.8.0?

Greetings,
Rodrigo Abt B.,
Statistical Analyst,
Department of Economic and Tributary Studies,
Studies Subdivision, SII, Chile.

-----Original Message-----
From: Thomas W Blackwell [mailto:tblackw at umich.edu]
Sent: Wednesday, November 12, 2003 12:43 PM
To: JFRI (Jesper Frickman)
Cc: rodrigo.abt at sii.cl; jmacdon at umich.edu; r-help at stat.math.ethz.ch
Subject: RE: [R] Memory issues..

> [...]