Could anyone help me resolve this problem? I'm presently an SAS user and have been exploring R for my application. I already posted this question when I was using a 32-bit machine with 2 GB of RAM, and the advice I got was to move to a 64-bit machine. I have now tried a 64-bit machine with 4 GB of RAM. I'm running predictive analytics in R, and to calibrate my model I adjust the variables used in it; that is where the problem happens: R just runs out of memory. I tried garbage collection as well.

Sample data:

    APN     condition   quality   site_zip   sale_date   sale_price   estimate
    1.1-1   good        good      10201      1/1/07      $234,000     $254,000
    1.5-1   average     good      10201      1/1/08      $254,000     $276,000
    1.6-1   poor        poor      10202      1/1/06      $192,000     $199,000
    1.7-1   good        good      10202      1/1/07      $300,000     $305,000

Regression equation:

    sale_price ~ condition + quality + site_zip

After running the above regression I get the estimates, and then I calibrate the model by adjusting the independent variables. For that purpose separate datasets are created and the model is run for 50 iterations. The problem occurs here: after a few iterations, R reports that it is out of memory.

I'm using R 2.10.0. If you need any other clarification I shall provide it. Please help me solve this.
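For reference, a minimal sketch of how the workflow described above might look in R; the data frame and the lm() call are reconstructions from the sample table, not the poster's actual code:

    # Toy data frame built from the four sample rows above; with so few
    # rows the fit is rank-deficient, but it illustrates the call.
    sales <- data.frame(
      condition  = factor(c("good", "average", "poor", "good")),
      quality    = factor(c("good", "good", "poor", "good")),
      site_zip   = factor(c("10201", "10201", "10202", "10202")),
      sale_price = c(234000, 254000, 192000, 300000)
    )

    # Fit the regression and attach the fitted estimates.
    fit <- lm(sale_price ~ condition + quality + site_zip, data = sales)
    sales$estimate <- predict(fit)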
How big is your data set? (Use object.size() on the object, and str().) Exactly what statements are you executing? Exactly what error message are you getting?

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?
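A minimal sketch of the diagnostics suggested above, assuming the data set lives in a data frame called 'sales' (substitute the actual object name):

    print(object.size(sales), units = "Mb")  # memory footprint of the object
    str(sales)                               # compact overview of its structure
    gc()                                     # force a garbage collection and report memory use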
prem_R <mtechprem at gmail.com> writes:

> I'm running predictive analytics in R, and to calibrate my model I
> adjust the variables used in it; that is where the problem happens:
> R just runs out of memory. I tried garbage collection as well.

I'm analyzing an 8 GB data set using R, so it can certainly handle large data sets. It tends to copy data very often, however, so you have to be very careful with it. For example, if you modify a single column in a data frame, R will copy the entire data frame rather than just replace the modified column.

If you are running a regression that saves the input data in the model result object, and you are modifying the data frame between runs, then it would be very easy to have many copies of your data in memory at once. One solution would be not to keep the model result objects around. Another would be to manually modify them to strip out the data object. This can be tricky, however, since copies of the data may live on in the environments of saved functions; I had this problem with 'mgcv::gam' fits.

I hope that helps.

Regards,
Johann
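A sketch of both workarounds using lm(); the loop structure, iteration count, and object names are assumptions for illustration, not the original code:

    # Workaround 1: don't accumulate full model objects across iterations.
    results <- vector("list", 50)
    for (i in 1:50) {
      fit <- lm(sale_price ~ condition + quality + site_zip, data = sales)
      results[[i]] <- coef(fit)  # keep only what you need, not the whole fit
      rm(fit)                    # drop the model object before the next iteration
      gc()                       # give freed memory back between iterations
    }

    # Workaround 2: tell lm() not to store copies of the data in the fit
    # (note: some downstream functions rely on the stored model frame).
    fit <- lm(sale_price ~ condition + quality + site_zip, data = sales,
              model = FALSE, x = FALSE, y = FALSE)

    # For fits like mgcv::gam, data can also be captured in formula/terms
    # environments, which is harder to strip, as Johann notes above.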
Yes, I think that explains the problem I'm facing. Could you please help me solve it?
You were asked to provide details, but so far have not.

--
David.