Dear friends,

I have to use a very large matrix, something of the sort of matrix(80000, 80000, n), where n is a numeric value of the form 0.xxxxxx. I have not found a way of doing it; I keep getting the error

Error in matrix(nrow = 80000, ncol = 80000, 0.2) : too many elements specified

Any suggestions? I have searched the mailing list, but to no avail.

Best,
--
Corrado Topi

Global Climate Change & Biodiversity Indicators
Area 18, Department of Biology
University of York, York, YO10 5YW, UK
Phone: +44 (0) 1904 328645, E-mail: ct529 at york.ac.uk
> I have to use a very large matrix. Something of the sort of
> matrix(80000, 80000, n), where n is a numeric value of the form 0.xxxxxx.
>
> I have not found a way of doing it. I keep getting the error
>
> Error in matrix(nrow = 80000, ncol = 80000, 0.2) : too many elements specified
>
> Any suggestions? I have searched the mailing list, but to no avail.

An 80000 x 80000 matrix has 6.4 billion cells. At 8 bytes (64 bits) per double-precision floating-point number, that's an impressive 51.2 GB. It certainly does not fit into RAM on my machine.

cu
Philipp

--
Dr. Philipp Pagel
Lehrstuhl für Genomorientierte Bioinformatik
Technische Universität München
Wissenschaftszentrum Weihenstephan
85350 Freising, Germany
http://mips.gsf.de/staff/pagel
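The arithmetic is easy to check from the R prompt (a minimal sketch; the 8-bytes-per-double figure is standard for R's numeric type). Note also that the specific error above fires even before memory is exhausted, because an R matrix of that era could not hold more than 2^31 - 1 elements:

    # Memory needed for a dense 80000 x 80000 matrix of doubles
    cells <- 80000 * 80000    # 6.4e9 cells, well past the 2^31 - 1 element limit
    bytes <- cells * 8        # 8 bytes per double -> 5.12e10 bytes
    bytes / 1e9               # 51.2 GB (decimal)
    bytes / 2^30              # ~47.7 GiB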
Looks like you've run out of memory, mate, because that sure is a big matrix; you'd probably need a 64-bit OS/CPU/R and loads of RAM. See this thread:

http://www.nabble.com/Error-in-matrix:--too-many-elements-specified-td20457910.html

I know there are some packages on CRAN which help with large datasets, but I haven't got around to using them myself (yet). From http://cran.r-project.org/web/views/HighPerformanceComputing.html:

[quote start...]
Large memory and out-of-memory data

* The bigmemory package by Kane and Emerson permits storing large objects such as matrices in memory and uses external pointer objects to refer to them. This permits transparent access from R without bumping against R's internal memory limits. Several R processes on the same computer can also share big memory objects.

* A large number of database packages, and database-alike packages (such as sqldf by Grothendieck), are also of potential interest but not (yet?) reviewed here.
[quote end.]

Just for fun, I just tried (for the first time):

    library(bigmemory)
    big.matrix(nrow = 80000, ncol = 80000, type = "double")

but I ended up crashing out of R on my low-spec Windows XP uni laptop :D

Hope that helps a little,
Tony Breyal
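For anyone trying this at home, here is a minimal sketch with the dimensions scaled down so the allocation actually succeeds (at the full 80000 x 80000 size the object needs roughly 51 GB, which is presumably what killed the session above):

    library(bigmemory)

    # An in-RAM big.matrix, small enough for any machine
    m <- big.matrix(nrow = 1000, ncol = 1000, type = "double", init = 0.2)
    m[1, 1:5]    # indexing returns an ordinary numeric vector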
Corrado,

Package bigmemory has undergone a major re-engineering and will be available soon (available now in a beta version upon request). The version currently on CRAN is probably of limited use unless you're on Linux.

bigmemory may be useful to you for data management, at the very least, where

    x <- filebacked.big.matrix(80000, 80000, init = n, type = "double")

would accomplish what you want, using file-backing (disk space) to hold the object. But even this requires 64-bit R (Linux or Mac, or perhaps the beta version of 64-bit Windows R that REvolution Computing is working on). Subsequent operations (e.g. extraction of a small portion for analysis) are then easy enough:

    y <- x[1, ]

would give you the first row of x as an object y in R. Note that x is not itself an R matrix, and most existing R analytics can't work on x directly (and would max out the RAM if they tried, anyway).

Feel free to email me for more information (and this invitation applies to anyone who is interested in this).

Cheers,
Jay

--
John W. Emerson (Jay)
Assistant Professor of Statistics
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay
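To make the workflow concrete, here is a minimal sketch with the dimensions scaled down so it runs on an ordinary machine; the backing-file names are made up for the example:

    library(bigmemory)

    # File-backed big.matrix: the data live on disk, not in RAM
    x <- filebacked.big.matrix(nrow = 1000, ncol = 1000,
                               init = 0.2,                      # the fill value "n"
                               type = "double",
                               backingfile = "example.bin",     # hypothetical file names
                               descriptorfile = "example.desc")

    y <- x[1, ]    # pulls one row back into an ordinary R vector
    head(y)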