similar to: running out of memory while running a VERY LARGE regression

Displaying 20 results from an estimated 80000 matches similar to: "running out of memory while running a VERY LARGE regression"

2008 Mar 24
1
Cannot allocate large vectors (running out of memory?)
Hi. As shown in the simplified example below, I'm having trouble allocating memory for large vectors, even though it would appear that there is more than enough memory available. That is, even with a memory limit of 1500 MB, R 2.6.1 (Win) will allocate memory for a first vector of 285 MB, but not for a second vector of the same size. Forcing garbage collection does not seem
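A minimal sketch of the failure mode described, assuming the 1500 MB limit from the post; on a 32-bit build the second allocation can fail even after a collection because the address space has fragmented and no single free block of ~285 MB remains:

    memory.limit(size = 1500)   # cap the heap at 1500 MB, as in the post (Windows only)
    v1 <- numeric(285e6 / 8)    # roughly 285 MB of doubles: succeeds
    gc()                        # forcing a collection...
    v2 <- numeric(285e6 / 8)    # ...may still fail: address-space fragmentation,
                                # not total free memory, is usually the culprit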
2009 Oct 28
1
Easy method to set user-mode virtual memory space in Windows Vista and 7
I thought I'd share this with the list since it appears to provide a quick fix to some memory problems, and I haven't seen it discussed in relation to R. To reallocate virtual memory from kernel-mode to user-mode in 32-bit Vista or Windows 7, one can use the increaseuserva boot option. See http://msdn.microsoft.com/en-us/library/aa906211.aspx On my 4GB Vista machine, R is now able to
2007 Dec 14
1
RODBC, optimizing memory, "Error: cannot allocate vector of size 522 Kb".
I am using RODBC and "odbcConnect". I have successfully used odbcConnect to extract "modest" amounts of data from SQL. For convenience (and maybe speed?) I wish, if possible, to extract larger amounts of data in a single query. (I am running R 2.6.0 on a machine running Windows Small Business Server with 3GB of RAM.) I run gc() prior to attempting the query. I have
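A common workaround is to pull the result set in fixed-size chunks instead of one giant query; a sketch using RODBC's odbcQuery() and sqlGetResults(max = ...), with a hypothetical DSN and table name:

    library(RODBC)
    ch <- odbcConnect("mydsn")                     # hypothetical DSN
    odbcQuery(ch, "SELECT * FROM big_table")       # hypothetical table
    repeat {
      chunk <- sqlGetResults(ch, max = 10000)      # fetch 10,000 rows at a time
      if (!is.data.frame(chunk) || nrow(chunk) == 0) break
      # process or append each chunk here, e.g. write it to disk
    }
    close(ch)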
2006 Jun 27
1
Memory available to 32-bit R app on 64-bit machine
I want to get a 64-bit machine/OS system so I can put 16GB of RAM in it. At first I assumed that I would have to use the 64-bit version of R to make use of the 16GB of RAM, which would mean using the Linux version of R. But I have heard many posters say they run the 32-bit version of R on a 64-bit machine/OS. So my question: under 64-bit Windows, how much memory would be available to
2005 Oct 18
1
Memory problems with large dataset in rpart
Dear helpers, I am a Dutch student from Erasmus University. For my Bachelor thesis I have written a script in R that uses boosting by means of classification and regression trees, via the predefined function rpart. My input file consists of about 4000 vectors, each having 2210 dimensions. In the third iteration R complains of a lack of memory, although in each iteration
2007 Sep 19
3
Row-by-row regression on matrix
Folks, I have a 3000 x 4 matrix (y), which I need to regress row-by-row against a 4-vector (x) to create a matrix lm.y of intercepts and slopes. To illustrate:

    y <- matrix(rnorm(12000), ncol = 4)
    x <- c(1/12, 3/12, 6/12, 1)
    system.time(lm.y <- t(apply(y, 1, function(z) lm(z ~ x)$coefficient)))
    [1] 44.72 18.00 69.52 NA NA

Takes more than a minute to do (and I need to do many
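One standard speedup: lm() accepts a matrix response, so all 3000 regressions can share a single QR decomposition. A sketch:

    y <- matrix(rnorm(12000), ncol = 4)
    x <- c(1/12, 3/12, 6/12, 1)
    fit <- lm(t(y) ~ x)    # one multivariate fit: the response is 4 x 3000
    lm.y <- t(coef(fit))   # 3000 x 2 matrix of intercepts and slopes

This runs in a fraction of the time because the model matrix is factored once and reused for every column.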
2018 Oct 01
1
unexpected memory.limit on windows in embedded R
Dear All, I'm linking R from another application and embedding it as described in the R-exts manual, i.e. with initialization done via Rf_initEmbeddedR. While everything works the same as in standalone R for Linux, under Windows I found a difference in the default memory.limit, which is fixed to 2GB (both win32 and win64) - compared to a limit in standalone R of 3.5GB for win32 and 16GB on
2002 Aug 09
1
LM: Least Squares on Large Datasets OR why lm() is designed the way it is
Hi, I have always wondered why S-Plus/R cannot fit a linear model to an arbitrarily large data set given that, I thought, it should be pretty straightforward. Some time ago I came across a reference to the LM package, http://www.econ.uiuc.edu/~anovo/LM.html, by Roger Koenker and Alvaro Novo. So I thought here it is at last, but to my surprise this project hasn't made it into the recommended
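The answer that later emerged is incremental least squares: only the cross-product matrices have to fit in memory, not the rows. A sketch using the biglm package (which postdates this 2002 thread), with hypothetical data frames chunk1 and chunk2 holding columns y, x1 and x2:

    library(biglm)
    fit <- biglm(y ~ x1 + x2, data = chunk1)   # fit on the first chunk
    fit <- update(fit, chunk2)                 # fold in each further chunk
    coef(fit)                                  # matches lm() on the full data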
2009 Oct 28
2
regression on large file
Dear R community, I have a fairly large file with variables in rows. Every variable (thousands of them) needs to be regressed on a reference variable. The file is too big to load into R (or R gets too slow having done it), so I now read it in line by line with "scan" (see below) and write the results to an output file. Although improved, this is still very slow... Can someone please help me and suggest
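A minimal sketch of that read-one-row, regress, write-one-result loop, with hypothetical file names and reference variable ref; keeping the connection open avoids re-scanning the file from the top on every pass:

    ref <- scan("reference.txt")                # hypothetical reference variable
    con <- file("data.txt", open = "r")         # hypothetical input file
    out <- file("results.txt", open = "w")
    while (length(y <- scan(con, nlines = 1, quiet = TRUE)) > 0) {
      fit <- lm(y ~ ref)                        # lm.fit(cbind(1, ref), y) would
      cat(coef(fit), "\n", file = out)          # skip formula overhead, much faster
    }
    close(con); close(out)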
2005 Nov 30
2
Too much memory cache being used while moving large file
System: CentOS 4.2, kernel 2.6.9-22.0.1.ELsmp, fully up to date. 3GB RAM, 3ware 9000S card with a RAID5 array. I think that's about all the relevant info... Had a file on disk (not on the array) and attempted to mv the file to the array. It went fine until 2.4GB had been copied, then slowed to a megabyte every few minutes. Free memory was ~50MB (typically 1.5-2GB) and cache was 2.5GB. Stopped the move, however the cache
2006 Sep 03
1
Memory issues
Hi, I'm using R on Windows and upgraded the computer memory to 4GB, as R was telling me that it is out of memory (for making heatmaps). It still says that the maximum memory is 1024Mb, even if I increase it using memory.limit and memory.size. Is there a way to permanently increase R's memory quota to 4GB? Please help. Many thanks, -DS.
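For reference, the calls involved look like this; note that a 32-bit R build cannot address more than roughly 2-3GB no matter what limit is requested, which is the usual reason the setting appears not to stick:

    memory.limit()             # current cap in MB
    memory.limit(size = 4095)  # raise the cap; a 32-bit process still tops out near 2-3GB
    memory.size(max = TRUE)    # most memory obtained from the OS so far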
2009 Feb 18
2
Running out of memory when importing SPSS files
Hello R-help, I am trying to import a large dataset from SPSS into R. The SPSS file is in .SAV format and is about 1GB in size. I use read.spss to import the file and get an error saying that I have run out of memory. I am on a Mac OS X 10.5 system with 4GB of RAM. Monitoring the R process tells me that R runs out of memory when reaching about 3GB of RAM, so I suppose the remaining 1GB is used up
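One thing worth trying, sketched with a hypothetical file name: keep read.spss()'s list output instead of converting to a data frame, and skip value labels, both of which avoid extra copies at the 1GB scale:

    library(foreign)
    dat <- read.spss("survey.sav",             # hypothetical .SAV file
                     to.data.frame = FALSE,    # keep the list form: no extra copy
                     use.value.labels = FALSE) # skip factor conversion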
2012 Apr 03
2
pairwise linear regression between two large datasets
Hi all, I am trying to perform some analysis on the residuals of pairwise linear regressions between two large sets: A with dimensions {k x m} and B with dimensions {k x n}. So I need to regress every column B[,j] of B on every column A[,i] of A and produce a matrix C with dimensions {m x n}, so that C[i,j] contains the z-score of the k-th (last) residual of the aforementioned linear regression. I have tried
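Since each regression has a single predictor, the whole m x n slope matrix collapses to one matrix product instead of m*n lm() calls; a sketch on column-centered data with small hypothetical dimensions:

    k <- 100; m <- 5; n <- 4
    A <- scale(matrix(rnorm(k * m), k, m), scale = FALSE)  # center the columns
    B <- scale(matrix(rnorm(k * n), k, n), scale = FALSE)
    slopes <- crossprod(A, B) / colSums(A^2)  # slopes[i,j] of B[,j] on A[,i]
    # the residual for pair (i,j) at row k is B[k,j] - slopes[i,j] * A[k,i]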
2007 Nov 05
1
question about running out of memory on R
Hi all, R newbie here, but I have been reading the boards and learning lots. I have read all the documents and list responses I could find about R and memory, but still no answer. QUESTION: when the sizes of your objects exceed your available RAM, R switches to virtual memory, right? If so, why does it so often run out of memory and return an "unable to allocate XXX KB" error shortly after
2006 Jan 05
4
Q: R 2.2.1: Memory Management Issues?
Dear Developers: I have a question about memory management in R 2.2.1 and am wondering if you would be kind enough to help me understand what is going on. (It has been a few years since I have done software development on Windows, so I apologize in advance if these are easy questions.) ------------- MY SYSTEM ------------- I am currently using R (version 2.2.1) on a PC running Windows 2000
2009 Jun 30
1
Clearing out or reclaiming memory
Hello, Is there a command for freeing up the memory used by R in holding data tables? The structure of the procedure I have is as follows: 1) Read multiple txt files in using read.table(...). 2) Combine the read tables using rbind(...). 3) Attach the data using attach(...) and then do a multiple regression using lm(...). So far so good, but when I then perform a further regression by
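There is no single "clear" command, but the standard pattern is rm() on the objects you no longer need followed by gc(), plus detach() to undo the earlier attach(); a sketch with hypothetical object names:

    detach(combined)           # undo an earlier attach(combined)
    rm(tbl1, tbl2, combined)   # drop the intermediate tables by name
    gc()                       # run the collector and report what was freed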
2008 Apr 29
2
Running regression (lm, lrm) 100+ times and saving the results as matrix
An undergraduate here, so do not hesitate to let me know if you feel that I'm heading in the wrong direction. I have a data frame containing panel data across 10 years (hence 120 months). I want to be able to run the regression separately for each month (or year). The below shows how I ran the regression for each month, but I need to know how I would combine the regression results
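One common pattern, sketched with a hypothetical data frame panel holding columns month, y, x1 and x2: fit one lm() per month and bind the coefficient vectors into a matrix with sapply():

    months <- sort(unique(panel$month))
    res <- t(sapply(months, function(m) {
      coef(lm(y ~ x1 + x2, data = panel[panel$month == m, ]))
    }))
    rownames(res) <- months    # one row of coefficients per month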
2009 Jul 14
1
Interaction term in multiple regression
Hello All, Thank you for taking my question. I am looking for information on how R handles interaction terms in a multiple regression using the lm() command. I originally noticed something unusual when my R output did not match the output from JMP for an identical test run previously. Both programs give identical results for the main test, and if the models do not contain the interaction
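A plausible explanation, offered here as an assumption since the thread is truncated: JMP codes factors with sum-to-zero effect contrasts while R defaults to treatment contrasts, so with an interaction term the individual coefficients differ even though the fitted models are identical. Switching R's contrasts often reconciles the two; a sketch with a hypothetical data frame d containing factors a, b and response y:

    options(contrasts = c("contr.sum", "contr.poly"))  # match effect-style coding
    fit <- lm(y ~ a * b, data = d)   # a * b expands to a + b + a:b
    summary(fit)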
2008 Jan 28
2
How to get out the t-test value matrix for a linear regression?
Hi all, I've written some R script to calculate the linear regression of a matrix. Here below is my script:

    x <- matrix(scan("h:/data/xxx.dat", 0), nrow = 46, ncol = 561, byrow = TRUE)
    year <- NULL
    year <- cbind(year, as.matrix(x[,1]))
    lm.sol <- lm(x ~ year)
    xtrend <- coef(lm.sol)[2,]   # get the matrix of regression coefficients
    t.test <- ?
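With a matrix response, summary() returns one summary.lm per column, so the slope t-values can be collected in one pass; a sketch that reuses the post's dimensions with random numbers standing in for the real file:

    x <- matrix(rnorm(46 * 561), nrow = 46, ncol = 561)  # stand-in for xxx.dat
    year <- x[, 1]                                       # first column as predictor
    lm.sol <- lm(x ~ year)
    xtrend <- coef(lm.sol)[2, ]                          # slope per series
    tvals <- sapply(summary(lm.sol),
                    function(s) s$coefficients["year", "t value"])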
2010 Oct 12
2
Memory limit problem
Dear List, I am trying to plot bathymetry contours around the Hawaiian Islands using the packages rgdal and PBSmapping. I have run into a memory limit when trying to combine two fairly small objects using cbind(). I have increased the memory to 4GB, but am being told I can't allocate a vector of size 240 Kb. I am running R 2.11.1 on a Dell Optiplex 760 with Windows XP. I have pasted