I have a loop that increases the size of an object after each iteration. When the
Windows Task Manager shows a "Mem Usage" of about 1.8GB, the Rgui.exe process no
longer responds.

I use:

"C:\Program Files\R\rw1080\bin\Rgui.exe" --max-mem-size=4000M --min-vsize=10M
--max-vsize=3000M --min-nsize=500k --max-nsize=1000M

I have a dual Xeon 2.8GHz processor box with 4GB of memory and "R version 1.8.0,
2003-10-08".

Any suggestions or ideas would be greatly appreciated.

Thanks,
Dick
*******************************************************************************
Richard P. Beyer, Ph.D.        University of Washington
Tel.: (206) 616 7378           Env. & Occ. Health Sci., Box 354695
Fax:  (206) 685 4696           4225 Roosevelt Way NE, #100
                               Seattle, WA 98105-6099
Could you compile up and try R-devel (see the FAQ)? It probably will cope with more
than 2Gb, and I've run it up to 2.5Gb. Note that an effective limit of 1.7Gb is
mentioned in the rw-FAQ.

On Wed, 19 Nov 2003, Dick Beyer wrote:

> I have a loop that increases the size of an object after each iteration.
> When the Windows Task Manager shows "Mem Usage" about 1.8GB, the
> Rgui.exe process no longer responds.
>
> I use:
>
> "C:\Program Files\R\rw1080\bin\Rgui.exe" --max-mem-size=4000M
> --min-vsize=10M --max-vsize=3000M --min-nsize=500k --max-nsize=1000M
>
> I have a dual Xeon 2.8GHz processor box with 4GB of memory and "R
> version 1.8.0, 2003-10-08".
>
> Any suggestions or ideas would be greatly appreciated.

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
On Wed, 19 Nov 2003 10:20:16 -0800 (PST), Dick Beyer <dbeyer at u.washington.edu> wrote:

> I have a loop that increases the size of an object after each iteration. When the
> Windows Task Manager shows "Mem Usage" about 1.8GB, the Rgui.exe process no longer
> responds.
>
> I use:
>
> "C:\Program Files\R\rw1080\bin\Rgui.exe" --max-mem-size=4000M --min-vsize=10M
> --max-vsize=3000M --min-nsize=500k --max-nsize=1000M
>
> I have a dual Xeon 2.8GHz processor box with 4GB of memory and "R version 1.8.0,
> 2003-10-08".
>
> Any suggestions or ideas would be greatly appreciated.

Normally the maximum memory allowed for any process in Windows is 2 GB. It's possible
to raise that to 3 GB, but R 1.8 doesn't know how, so that's an absolute upper limit.
Version 1.9 may be able to go up to 3 GB, but beyond that you'll probably need a
64-bit processor: as far as I know, all the 32-bit OSes limit each process to 2 or
3 GB, because they reserve 1 or 2 GB for themselves.

I don't know why you're hitting the ceiling at 1.8 GB, but it may be that there's
unreported overhead. I also don't know why it's not failing gracefully. My only
suggestion is to say "Don't do that".

Duncan Murdoch
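As an aside, a minimal sketch (not from the replies above) of the Windows-only
helpers R provides for watching how close a session is to the ceiling:

  memory.limit()           # current limit in Mb, as set by --max-mem-size
  memory.size(max = TRUE)  # maximum Mb obtained from the OS so far this session
  gc()                     # force a garbage collection; reports Ncells/Vcells usage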
With a custom compiled kernel, I've run R processes that used more than 5GB of RAM on
a Linux box with 8GB RAM and dual Xeons. So it seems to work on 32-bit Linux with a
big-memory kernel.

Andy

> From: Duncan Murdoch [mailto:dmurdoch at pair.com]

[snip]

> Normally the maximum memory allowed for any process in
> Windows is 2 GB. It's possible to raise that to 3 GB but R
> 1.8 doesn't know how, so that's an absolute upper limit.
> Version 1.9 may be able to go up to 3 GB, but beyond that
> you'll probably need a 64 bit processor: as far as I know
> all the 32 bit OS's limit each process to 2 or 3 GB, because
> they reserve 1 or 2 GB for themselves.

[snip]

> Duncan Murdoch
Sorry. I need to retract my claim. There seems to be a 3G limit per process, even
though the OS could handle nearly 8G. (I can have two simultaneous R processes each
using nearly 3G.)

On another note, on our dual Opteron box R (compiled as 64-bit) could easily use
nearly all of the 16G in that box (that's one of the reasons for having that box).

Cheers,
Andy

> From: Paul Gilbert [mailto:pgilbert at bank-banque-canada.ca]
>
> Liaw, Andy wrote:
> > With a custom compiled kernel, I've run R processes that used more than 5GB
> > of RAM on a Linux box with 8GB RAM and dual Xeons. So it seems to work on
> > 32-bit Linux with big memory kernel.
> >
> > Andy
>
> I'm curious about this. I believe the address space limit of a 32-bit
> processor is 4G, and I thought Xeons were 32-bit processors.
> How can a single process exceed the address space?
>
> Thanks,
> Paul Gilbert

[snip]
Dear MacR users

During the weekend I did a clean install of Mac OS X 10.3.1, of Apple's X11, of
Apple's development tools, and of the basic Fink 0.6.2 package.

Now I have just downloaded Raqua.dmg from CRAN and installed the RAqua, libreadline
and tcltk packages. Unfortunately, double-clicking on StartR has no effect, and
starting R from the command line does not find R, even if I start it from
/usr/local/bin.

Does the RAqua binary work with Panther?
(Why do I need to install tcltk, when it comes installed with Panther?)

Thank you in advance
Best regards
Christian
_._._._._._._._._._._._._._._._
C.h.i.s.t.i.a.n S.t.r.a.t.o.w.a
V.i.e.n.n.a A.u.s.t.r.i.a
_._._._._._._._._._._._._._._._
Dear MacR users

Sorry for the earlier mail; everything now works really well. For some reason I could
not start R immediately after installation: I had to log out first and then log in
again, and then R started up with a really nice R Console.

Best regards
Christian

cstrato wrote:

> Dear MacR users
>
> During the weekend I did a clean install of MacOSX 10.3.1, of Apples X11,
> of Apples development tools, and of the basic Fink 0.6.2 package.
>
> Now I have just downloaded Raqua.dmg from CRAN and installed the
> RAqua, libreadline and tcltk packages.

[snip]
Dear R experts

This is a general question: does R have functions for nonlinear robust regression,
analogous to e.g. LTS?

Searching Google I have found:
1. an abstract on generalizing LTS to nonlinear regression models, see:
   http://smealsearch.psu.edu/1509.html
2. AD Model Builder, see: http://otter-rsch.com/admodel/cc1.html
but no mention of R/S.

Thank you in advance
Best regards
Christian
_._._._._._._._._._._._._._._._
C.h.i.s.t.i.a.n S.t.r.a.t.o.w.a
V.i.e.n.n.a A.u.s.t.r.i.a
_._._._._._._._._._._._._._._._
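As an aside not raised in the replies: for models that are linear in the parameters,
least trimmed squares is already available via lqs() in package MASS; a minimal
sketch with a few gross outliers:

  library(MASS)
  x <- 1:50
  y <- 2 + 3*x + rnorm(50)
  y[c(5, 17, 40)] <- y[c(5, 17, 40)] + 200   # introduce gross outliers
  lts.fit <- lqs(y ~ x, method = "lts")      # least trimmed squares
  coef(lts.fit)                              # should stay close to (2, 3)

The genuinely nonlinear case is what the rest of this thread is about.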
Dear all

Since I did not receive any answer to my general question (?), let me ask a concrete
question: how can I fit the simple function y = a*sin(x)/b*x?

This is the code that I tried, but nls gives an error:

x <- seq(1,10,0.1)
y <- sin(x)/x
plot(x,y)
z <- jitter(y,amount=0.1)
plot(x,z)
df <- as.data.frame(cbind(x,z))
nf <- nls(z ~ a*sin(x)/b*x, data=df, start=list(a=0.8,b=0.9), trace = TRUE)

I have followed the Puromycin example, which works fine:

Pur.wt <- nls(rate ~ (Vm * conc)/(K + conc), data = Treated,
              start = list(Vm = 200, K = 0.1), trace = TRUE)

Am I making some mistake, or is it not possible to fit sin(x)/x?

Thank you in advance
Best regards
Christian
_._._._._._._._._._._._._._._._
C.h.i.s.t.i.a.n S.t.r.a.t.o.w.a
V.i.e.n.n.a A.u.s.t.r.i.a
_._._._._._._._._._._._._._._._

cstrato wrote:

> Dear R experts
>
> This is a general question:
> Does R have functions for nonlinear robust regression,
> analogous to e.g. LTS?

[snip]
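A likely explanation for the error, sketched here for reference: in the formula
z ~ a*sin(x)/b*x, operator precedence makes the right-hand side (a*sin(x)/b)*x
rather than a*sin(x)/(b*x), and in either form a and b enter only through their
ratio, so nls() stops with a singular-gradient error. Fitting a single,
identifiable scale parameter works:

  x  <- seq(1, 10, 0.1)
  z  <- jitter(sin(x)/x, amount = 0.1)
  df <- data.frame(x = x, z = z)
  nf <- nls(z ~ a * sin(x)/x, data = df,     # only a is estimated; a/b was confounded
            start = list(a = 0.8), trace = TRUE)
  coef(nf)                                   # a should come out close to 1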
Dear all

Here is a hopefully better example with regard to nonlinear robust fitting:

# fitting a polynomial:
x <- seq(-10,10,0.2)
y <- 10*x + 4*x*x - 2*x*x*x
plot(x,y)
z <- jitter(y,amount=300)
plot(x,z)
df <- as.data.frame(cbind(x,z))
nf <- nls(z ~ a*x + b*x*x + c*x*x*x, data=df,
+    start=list(a=4,b=2,c=1), trace = TRUE)
127697531 :  4  2  1
2974480 :  10.972123   3.793426  -1.942278

# introducing outliers before fitting the polynomial:
z1 <- z
z1[c(16,22,23,34,36,42,67,69,72,76)] <-
+    c(2000,1900,2000,1900,1600,1600,500,-2000,-1700,-1800)
plot(x,z1)
df1 <- as.data.frame(cbind(x,z1))
nf1 <- nls(z1 ~ a*x + b*x*x + c*x*x*x, data=df1,
+    start=list(a=4,b=2,c=1), trace = TRUE)
159359174 :  4  2  1
24098548 :  -59.053288   4.169518  -1.072027

# plotting the results:
y1 <- 10.97*x + 3.79*x*x - 1.94*x*x*x
y2 <- -59.05*x + 4.17*x*x - 1.07*x*x*x
oldpar <- par(pty="s",mfrow=c(2,2),mar=c(5,5,4,1))
plot(x,y)
plot(x,z1)
plot(x,y1)
plot(x,y2)
par(oldpar)

In my opinion this fit could hardly be considered robust.

Are there functions in R which can do robust fitting?
(Unfortunately, at the moment I cannot test the package nlrq mentioned by
Roger Koenker.)

Best regards
Christian
_._._._._._._._._._._._._._._._
C.h.i.s.t.i.a.n S.t.r.a.t.o.w.a
V.i.e.n.n.a A.u.s.t.r.i.a
_._._._._._._._._._._._._._._._
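For reference, a sketch of the nlrq() route mentioned by Roger Koenker (untested in
the thread; it assumes the quantreg package is installed): fitting the conditional
median (tau = 0.5) instead of the mean is much less sensitive to the outliers
introduced above. Here df1 and z1 are as constructed in the example above:

  library(quantreg)                          # provides nlrq() for nonlinear quantile regression
  rq1 <- nlrq(z1 ~ a*x + b*x^2 + c*x^3, data = df1,
              start = list(a = 4, b = 2, c = 1), tau = 0.5, trace = TRUE)
  coef(rq1)                                  # compare with the nls() coefficients above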
1. The question of "linear" vs. "nonlinear" means "linear in the parameters to be
estimated". All the examples you have given so far are linear in the parameters to be
estimated. The fact that they are nonlinear in "x" is immaterial.

2. With this hint and the posting guide "http://www.R-project.org/posting-guide.html",
you may find more information. A search there exposed much discussion of "robust
regression" and even "robust nonlinear regression", if you actually still need that.
In addition, I found useful information on robust regression in Venables and Ripley
(2002) Modern Applied Statistics with S, 4th ed. (Springer).

hope this helps.
spencer graves

cstrato wrote:

> Dear all
>
> Here is a hopefully better example with regards to
> nonlinear robust fitting:

[snip]
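To make point 1 concrete, a small illustration (not from the reply itself): the cubic
is linear in its coefficients, so lm() can fit it directly, whereas a model such as
y = a*exp(b*x) is nonlinear in the parameter b and genuinely needs nls():

  x <- seq(-10, 10, 0.2)
  y <- 10*x + 4*x^2 - 2*x^3 + rnorm(length(x), sd = 50)
  fit.lin <- lm(y ~ x + I(x^2) + I(x^3) - 1)      # linear in the three coefficients

  x2 <- seq(0, 5, 0.1)
  y2 <- 2*exp(0.7*x2) + rnorm(length(x2), sd = 0.5)
  fit.nl <- nls(y2 ~ a*exp(b*x2),                 # nonlinear in b
                start = list(a = 1, b = 0.5))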
Dear Spencer and all

As you see, I have changed the subject title, because at the moment this was my
interest.

ad 2, I always check MASS first.
ad 1, As mentioned above, I wanted to do a robust fit of a nonlinear function,
although robust nonlinear regression is also of interest to me.

Thank you all for your replies, especially Sundar Dorai-Raj, who gave the final hint:

lf <- lm(z ~ x + I(x^2) + I(x^3) - 1, data = df)
lf
Coefficients:
     x  I(x^2)  I(x^3)
10.972   3.793  -1.942

lf1 <- lm(z1 ~ x + I(x^2) + I(x^3) - 1, data = df1)
lf1
Coefficients:
      x  I(x^2)  I(x^3)
-59.053   4.170  -1.072

Now, using rlm from MASS gives the following results:

rlf <- rlm(z ~ x + I(x^2) + I(x^3) - 1, data = df)
rlf
Converged in 3 iterations
Coefficients:
        x    I(x^2)    I(x^3)
11.118137  3.793672 -1.943496

rlf1 <- rlm(z1 ~ x + I(x^2) + I(x^3) - 1, data = df1)
rlf1
Converged in 5 iterations
Coefficients:
        x    I(x^2)    I(x^3)
-2.169452  3.826027 -1.778487

Comparing lm and rlm reveals that rlm is able to handle outliers much better than lm.

Best regards
Christian

Spencer Graves wrote:

> 1. The question of "linear" vs. "nonlinear" means "linear in the
> parameters to be estimated. All the examples you have given so far are
> linear in the parameters to be estimated. The fact that they are
> nonlinear in "x" is immaterial.

[snip]
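One way to see the difference visually (a sketch, reusing x, z1 and df1 from the
earlier example): overlay the fitted curves from lm and rlm on the data containing
the outliers.

  library(MASS)
  lf1  <- lm(z1 ~ x + I(x^2) + I(x^3) - 1, data = df1)
  rlf1 <- rlm(z1 ~ x + I(x^2) + I(x^3) - 1, data = df1)

  plot(x, z1)
  lines(x, predict(lf1,  newdata = df1), lty = 2)   # least squares: pulled towards the outliers
  lines(x, predict(rlf1, newdata = df1), lty = 1)   # rlm: stays close to the true cubic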