On 10/06/2011 08:33 AM, Uwe Ligges wrote:
>
> On 06.10.2011 16:49, Alaios wrote:
>> Dear all,
>> I have a few binary files of around 9 GB, or even 50 GB.
Hi Alaios --
Maybe you are working in a particular domain (e.g., high-throughput
sequence analysis) for which there are packages (e.g., at
http://bioconductor.org) that make it easier to work with data of this
size and format.
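Whichever packages you end up with, a generic fall-back is to stream the
file through a connection in fixed-size chunks rather than reading it in
whole. A minimal sketch, assuming a flat binary file of doubles (the file
name, element type, and chunk size below are placeholders to adapt to
your actual format):

    con <- file("bigdata.bin", open = "rb")   # placeholder file name
    chunk_elems <- 1e6                        # values read per chunk; tune to your RAM
    repeat {
        x <- readBin(con, what = "double", n = chunk_elems)
        if (length(x) == 0) break             # zero-length read means end of file
        ## process the chunk 'x' here, e.g. accumulate running summaries
    }
    close(con)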
Martin
>> I would like to ask what the known limits of R are for data
>> processing. I have a system with a lot of RAM (500 GB)
>
> Really accessible from one core? Amazing.
>
>> but I am not sure about the "internal" limitations of R. How long
>> can a vector be, for example?
>
> See
> ?"Memory-limits"
>
> Uwe Ligges
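For reference, the key figure documented there (for the R 2.x series
current at the time of writing) is that a single vector is limited to
2^31 - 1 elements, regardless of available RAM, so a double-precision
vector tops out at roughly 16 GB. A quick illustration:

    .Machine$integer.max                             # 2147483647 = 2^31 - 1, the per-vector length limit
    print(object.size(numeric(1e6)), units = "Mb")   # ~7.6 Mb per million doubles, so ~16 Gb at the limit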
>
>
>
>>
>>
>> Could you please inform me about these internal R limitations?
>>
>> I would like to thank you in advance for your help
>>
>> B.R
>> Alex
>>
--
Computational Biology
Fred Hutchinson Cancer Research Center
1100 Fairview Ave. N. PO Box 19024 Seattle, WA 98109
Location: M1-B861
Telephone: 206 667-2793