Alexy Khrabrov wrote:
> I've read in Phil Spector's new book that it's a good idea to
> preallocate a big matrix, like
>
> u <- matrix(0,nrow,ncol) # (1)
>
> Now, I read contents of a huge matrix from a Fortran binary dump.
>
> u <- readBin(con,what="double",n=nrow*ncol) # (2)
>
> If I do (1) and then (2), u ends up a vector: evidently it is either
> reallocated or its matrix nature is lost -- overridden? overwritten?
>
> Instead, I do it now as
>
> u <- matrix(readBin(con, what="double", n=nrow*ncol),
>             nrow=nrow, ncol=ncol)   # (3)
>
> What's going on with memory management here, and what's the right
> way to make it efficient -- and how to preallocate?
>
> After that, I'm saving u as an R binary object in an rda file. Does
> it make sense to preallocate u before reading it back from the rda
> file?
>
This kind of preallocation only makes sense when an object is grown
piecewise, e.g., inside a loop, where enlarging it on every iteration
forces R to copy it repeatedly. In your case it yields no benefit:
readBin() allocates and fills the whole vector in a single step, and
the matrix allocated in (1) is simply discarded when u is rebound in
(2). The same holds for load()ing the rda file later -- load() creates
the object anew, so preallocating u first gains nothing there either.
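For illustration, here is the loop case where preallocation does
matter (a toy sketch; the size n and the i^2 computation are made up):

n <- 1e5
slow <- numeric(0)
for (i in 1:n) slow <- c(slow, i^2)   # grows: copies the vector each time

fast <- numeric(n)                    # one allocation up front
for (i in 1:n) fast[i] <- i^2         # fills in place

The first version does quadratic work in n because of the repeated
copying; the second allocates once and only writes.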
Anyway, if you really do want to reuse a preallocated u, you could say:

u[] <- readBin(con, what="double", n=nrow*ncol)   # (2)

(the empty brackets index all elements, so the values are assigned in
place and u keeps its dim attribute, i.e., it stays a matrix)
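To see that the dim attribute survives, a quick toy example (made-up
2 x 3 dimensions standing in for your nrow/ncol):

u <- matrix(0, 2, 3)
u[] <- as.double(1:6)   # fills in column-major order
dim(u)                  # still c(2, 3)
is.matrix(u)            # TRUE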
Best wishes,
Uwe
> Cheers,
> Alexy
>