On 12-02-10 9:12 AM, Djordje Bajic wrote:
> Hi all,
>
> I am trying to fill a 904x904x904 array, but at some point in the loop R
> states that the 5.5 Gb vector is too big to allocate. I have looked at
> packages such as "bigmemory", but I need help to decide which is the best
> way to store such an object. It would be perfect to store it in this "cube"
> form (for indexing and computation purposes). If not possible, maybe the
> best is to store the 904 matrices separately and read them individually
> when needed?
>
> Never dealt with such a big dataset before, so any help will be appreciated.
>
> (R+ESS, Debian 64bit, 4Gb RAM, 4core)
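Just to put a number on the problem (assuming double-precision entries,
8 bytes each, which matches the 5.5 Gb in the error message):
904^3 * 8 / 2^30    # about 5.5 GiB for the full cube
so however it is laid out, the whole cube is not going to fit in 4 GB of RAM.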
I'd really recommend getting more RAM, so you can have the whole thing
loaded in memory. 16 GB would be nice, but even 8 GB should make a
substantial difference. At 904^3 (about 7.4e8) entries the cube is
actually within R's 2^31-1 limit on vector length, so the allocation is
failing simply because a single 5.5 GB vector doesn't fit in 4 GB of
RAM. Even with more memory, a list of 904 matrices is a convenient way
to hold it: it avoids one huge contiguous allocation and lets you build
or drop slices one at a time, e.g.
x <- vector("list", 904)          # one 904 x 904 slice per list element
for (i in 1:904)
  x[[i]] <- matrix(0, 904, 904)
and then refer to entry i,j,k as x[[i]][j,k].
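If getting more RAM is not an option, the other approach you mention,
storing the 904 slice matrices in separate files and reading them
individually when needed, could look roughly like this (just a sketch
using saveRDS/readRDS; the directory and file names are placeholders):
dir.create("slices", showWarnings = FALSE)
for (i in 1:904) {
  slice <- matrix(0, 904, 904)    # compute slice i here
  saveRDS(slice, file.path("slices", sprintf("slice_%03d.rds", i)))
}
## later, to read entry (i, j, k):
i <- 1; j <- 2; k <- 3
slice.i <- readRDS(file.path("slices", sprintf("slice_%03d.rds", i)))
slice.i[j, k]
Only one 904 x 904 slice (about 6.5 MB) is in memory at a time that way,
at the cost of a file read for each slice you touch.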
Duncan Murdoch