Hi all,

I am getting the following error message:

> mymodel = glm(response ~ . , family=binomial, data=C);
Error: cannot allocate vector of size 734.2 Mb
In addition: Warning messages:
1: In array(0, c(n, n), list(levs, levs)) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In array(0, c(n, n), list(levs, levs)) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In array(0, c(n, n), list(levs, levs)) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In array(0, c(n, n), list(levs, levs)) :
  Reached total allocation of 1535Mb: see help(memory.size)

-----------

The data frame is 60000 x 20. Is that too large for R?

What should I do? Would closing all other software/applications help?
My PC runs Vista with 4GB of memory. Thank you.
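A quick diagnostic, sketched under the assumption that the data frame is
named C as above: the repeated "array(0, c(n, n), list(levs, levs))"
warnings usually point at a factor predictor with a very large number of
levels (734.2 Mb of doubles is roughly a 9800 x 9800 matrix), so it is
worth checking the factor columns before blaming R's memory ceiling:

    ## Number of levels of every factor column in C; a column with
    ## thousands of levels (e.g. an ID accidentally read as a factor)
    ## would explain the huge allocation.
    sapply(Filter(is.factor, C), nlevels)

    ## How much memory does the data itself occupy?
    print(object.size(C), units = "Mb")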
On Jun 14, 2009, at 9:06 PM, Michael wrote:

> I am getting the following error message:
>
>> mymodel = glm(response ~ . , family=binomial, data=C);
> Error: cannot allocate vector of size 734.2 Mb
>
> The data frame is 60000 x 20. Is that too large for R?
>
> What should I do? Would closing all other software/applications help?
> My PC runs Vista with 4GB of memory. Thank you.

It's certainly not too large for R. Have you looked at the R Windows FAQ
on the topic?

http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

... and perhaps:

http://finzi.psych.upenn.edu/Rhelp08/2008-August/171649.html

David Winsemius, MD
Heritage Laboratories
West Hartford, CT
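The FAQ entry referenced above boils down to checking and raising R's
allocation cap on 32-bit Windows. A minimal sketch (these functions are
Windows-only, and the achievable ceiling depends on the OS):

    memory.size()              # Mb currently in use by R
    memory.limit()             # current cap (1535 Mb in the error above)
    memory.limit(size = 3000)  # try to raise the cap, in Mb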
I have to use logistic regression...

On Sun, Jun 14, 2009 at 8:04 PM, Frank E Harrell Jr
<f.harrell at vanderbilt.edu> wrote:

> Also it would be useful to compare glm with the lrm function in the
> Design package, for speed and memory use.
>
> Frank
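If glm itself must be used, a couple of memory-saving tweaks are worth
trying. A sketch assuming the setup above, with x1, x2, x3 standing in
for the real predictor names (spelling out the predictors instead of "."
also keeps an unintended high-level factor, such as an ID column, out of
the model):

    ## model/x/y = FALSE stop glm from storing copies of the model
    ## frame, design matrix and response in the fitted object.
    mymodel <- glm(response ~ x1 + x2 + x3,
                   family = binomial, data = C,
                   model = FALSE, x = FALSE, y = FALSE)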
But what's the benefit of using the Design package? Thanks!

On Sun, Jun 14, 2009 at 8:04 PM, Frank E Harrell Jr
<f.harrell at vanderbilt.edu> wrote:

> Also it would be useful to compare glm with the lrm function in the
> Design package, for speed and memory use.
>
> Frank
>
> --
> Frank E Harrell Jr   Professor and Chair           School of Medicine
>                      Department of Biostatistics   Vanderbilt University
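For comparison, Frank Harrell's suggestion would look roughly like this
(a sketch with placeholder predictor names; the Design package has since
been superseded by rms):

    library(Design)  # install.packages("Design") first if needed

    ## lrm fits the same binary logistic model and is often faster
    ## and leaner than glm on large data sets.
    mymodel2 <- lrm(response ~ x1 + x2 + x3, data = C)
    mymodel2  # prints coefficients plus discrimination indexes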