Hi Jay Emerson,
Our primary intention is to optimize R to exploit the parallel
processing capabilities of the Cell BE processor. (Has any work been
done in this area?)
We have huge pages (1 MB and 16 MB) available on the system, and, as
you pointed out, our data is in the GB range. The idea is that if
vectors of this size are allocated from huge pages, performance should
naturally improve (fewer TLB misses, in particular). How could we
implement this, and how should we proceed?
Also, the upper limit of node class 6 is specified as 128 bytes. Would
there be any side effect from increasing this to 512 bytes or so
(depending on the average size of our application's data)?
Thanks in advance & regards,
R.Subramanian
On Tue, Jun 24, 2008 at 11:22 PM, Jay Emerson <jayemerson@gmail.com>
wrote:
> I saw your post to r-project... multiple MB of data isn't really a
> problem. Multiple GB of data certainly are. I'm wondering if you can
> clarify the problems you are having? It isn't clear to me that simply
> replacing malloc would do much. R has many strengths (convenience is
> one of them), but there are design features/quirks/limitations that
> are really distinct from the choice of malloc.
>
> Jay
>
> --
> John W. Emerson (Jay)
> Assistant Professor of Statistics
> Director of Graduate Studies
> Department of Statistics
> Yale University
> http://www.stat.yale.edu/~jay
>