It will be helpful on this list to use measures that readers everywhere recognize: 12 lakh is 1.2
million, so your data set is 1.2 million observations x 15 variables. I do not
know the intricacies of R, so you may have to wait for someone with that
knowledge to respond.
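As a rough order-of-magnitude sketch, assuming all 15 columns are numeric
doubles (8 bytes each) and using a hypothetical data frame called dat for
illustration, the raw data alone take about 140 MB before R makes any working
copies; intermediate copies made during model fitting can easily multiply that
several times over.

    1.2e6 * 15 * 8 / 1024^2                  # ~137 MB for the raw numeric data
    # object.size(dat)                       # actual size of an existing object,
    # print(object.size(dat), units = "MB")  # e.g. the hypothetical 'dat'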
Including the relevant portions of the error messages and your code in the
query will also help someone respond to your message.
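For example, pasting the output of a few commands such as the following (a
sketch; memory.size() and memory.limit() exist only on Windows builds of R)
usually gives responders enough context:

    sessionInfo()    # R version, platform, 32- vs 64-bit build, loaded packages
    memory.size()    # MB currently used by R (Windows only)
    memory.limit()   # current allocation ceiling in MB (Windows only)
    gc()             # garbage-collection summary of memory in use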
Anupam.
-----Original Message-----
From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org] On Behalf Of Abhisek Saha
Sent: Saturday, June 11, 2011 6:25 AM
To: r-help at r-project.org
Subject: [R] Memory(RAM) issues
Dear All,
I have been working with R (the desktop version) on Windows Vista, with the
latest version, R 2.13.0. I have been working with a few data sets of size
12 lakh x 15, and my code is quite compute-intensive (applying an MCMC Gibbs
sampler to a posterior of 144 variables), so it often runs into a memory error
along the lines of "cannot allocate vector of size ..." once usage reaches
something like 1.5 GB. I have 2 GB of RAM. I checked options such as
memory.size, and the documentation says that 64-bit R running on 64-bit
Windows can address up to 8 TB.
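For reference, the checks behind this were roughly as follows (a sketch, not
my exact console output):

    .Machine$sizeof.pointer    # 8 on a 64-bit build of R, 4 on a 32-bit build
    R.version$arch             # e.g. "x86_64" for 64-bit, "i386" for 32-bit
    memory.size(max = TRUE)    # most memory obtained from Windows so far (MB)
    memory.limit()             # current allocation limit in MB (Windows only)
    help("Memory-limits")      # documents the 8 TB address-space limit for 64-bit R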
Now, I don't have the background to understand the definitions of, and
differences between, 32-bit and 64-bit machines, or other technical
requirements such as servers, but it would be a great help if anyone could
give me a feel for them. Could any of you tell me whether a server version of
R would resolve my issue (I am not sure what kind of server my company would
allow R to be installed on at this point, perhaps a Linux one), and if so,
could you guide me on how to go about installing it on a server?
Thank you,
Abhisek