Look at the CRAN Task View for High Performance Computing
(http://cran.r-project.org/web/views/HighPerformanceComputing.html); it has
a section on packages for large-memory and out-of-memory analyses.
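For example, the bigmemory package (one of those listed there) can keep a
large matrix file-backed on disk. A minimal sketch, assuming your daily
prices are in an all-numeric CSV (the file names below are made up):

library(bigmemory)

# Create a file-backed big.matrix: the data live on disk, so only the
# pieces you index into are pulled into RAM. Assumes every column is
# numeric (e.g., dates already converted to numbers).
prices <- read.big.matrix("daily_prices.csv", header = TRUE,
                          type = "double",
                          backingfile = "daily_prices.bin",
                          descriptorfile = "daily_prices.desc")

# Later R sessions can re-attach the same backing file without
# re-reading the CSV
prices <- attach.big.matrix("daily_prices.desc")

# Index it like an ordinary matrix, one chunk at a time
first_col <- prices[, 1]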
There are also sections on parallel computing, which is another way to deal
with large data if you have access to the right type of cluster (part of the
dataset lives on each machine, so each piece fits in local memory).
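As a sketch of that idea using the base parallel package, you can spread
independent per-stock backtests across local cores so that only one
symbol's data needs to be in memory on each worker at a time. The
backtest_one() function below is a made-up stand-in for your own strategy
code:

library(parallel)

# Hypothetical per-symbol backtest: simulates a year of daily prices and
# returns the mean daily log return, standing in for real strategy code
backtest_one <- function(sym) {
  prices <- cumprod(c(100, 1 + rnorm(249, 0, 0.01)))
  mean(diff(log(prices)))
}

cl <- makeCluster(2)              # or detectCores() - 1
symbols <- c("AAA", "BBB", "CCC") # made-up tickers
results <- parLapply(cl, symbols, backtest_one)
stopCluster(cl)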
On Tue, Nov 25, 2014 at 6:06 AM, jeeth ghambole <jeethghambole at gmail.com> wrote:
> Hello All,
>
> I am working on backtesting strategies for stocks using daily prices.
>
> Initially the data were small enough to handle easily with R and SQL, but
> my analysis has now extended to a much larger set of data. Can anyone
> suggest the best packages available for handling large datasets?
>
> Thank you.
>
> With Regards,
> Jeeth G.
>
--
Gregory (Greg) L. Snow Ph.D.
538280 at gmail.com